The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information posted on the forums is preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.

    *** Official Clevo P770ZM / Sager NP9772 and P770ZM-G / Sager NP9773 Owner's Lounge ***

    Discussion in 'Sager/Clevo Reviews & Owners' Lounges' started by HTWingNut, Jan 6, 2015.

  1. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    k.

    next on the agenda: hostile takeover of nGreedia's headquarters.
     
    TomJGX likes this.
  2. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    I agree, boss.

    Also read my addendum as it pertains to you.
     
    noteless and TomJGX like this.
  3. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    WOW.

    #nVidia
    #BestCompany
     
  4. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    They're making a strong push for this year's award.

     
  5. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    Let me guess, 990M won't support SLI either. I bet it'll be gsync-only too. =D.
     
    ajc9988 likes this.
  6. Chrack

    Chrack Notebook Consultant

    Reputations:
    27
    Messages:
    135
    Likes Received:
    126
    Trophy Points:
    56
    I had the same problem too, but now it's gone.
    I bought the P771ZM-G Sync without an OS. I installed Win 8.1, but without a UEFI setup in the BIOS, and the Nvidia 353.30 driver.
    Sometimes I got the black screen on startup. I shut the laptop off with the power button, and after starting it up again it booted in about 4 seconds, as if it had been asleep.
    On every startup I saw "resume from hibernation".

    I upgraded to Win 10 with the Nvidia 355.60 driver and had the same problem again from time to time.
    I read about the damaged Alienware notebooks and went back to Win 8.1.
    BUT...

    This time I did a new clean install of Win 8.1, now as a UEFI install, but with the same Nvidia 353.30 driver.
    And now my problem is gone. No black screen and no "resume from hibernation" for the last 7 days o_O

    I checked my raw EDID data with "Monitor Asset Manager" and it checks out on www.edidreader.com:
    "valid checksum: TRUE"

    @t456: I don't know why my problem is gone; did hibernation maybe break something?
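    (For reference, the checksum rule edidreader.com verifies is simple: every 128-byte EDID block must sum to 0 mod 256, with the last byte acting as the checksum. A minimal Python sketch, where "edid.bin" is a hypothetical dump exported from Monitor Asset Manager:)

    # Each 128-byte EDID block, including its final checksum byte,
    # must sum to 0 mod 256 for the checksum to be valid.
    def edid_block_valid(block: bytes) -> bool:
        assert len(block) == 128, "EDID blocks are exactly 128 bytes"
        return sum(block) % 256 == 0

    with open("edid.bin", "rb") as f:  # hypothetical raw EDID dump
        data = f.read()

    for i in range(len(data) // 128):
        ok = edid_block_valid(data[i * 128:(i + 1) * 128])
        print(f"block {i}: checksum {'OK' if ok else 'BAD'}")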
     
    ajc9988 likes this.
  7. TomJGX

    TomJGX I HATE BGA!

    Reputations:
    1,456
    Messages:
    8,707
    Likes Received:
    3,315
    Trophy Points:
    431
    Can someone please make a video showing how to remove the keyboard and access the 2 RAM slots underneath it? I want to get 32GB of RAM lol..
     
  8. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    It's been almost a year since the release of DSR, and the SLI+G-Sync+DSR combo is still not possible. I think Nvidia is preemptively disabling SLI on G-Sync MXM cards until they figure that one out (if they ever do). Well, at least that's my hypothesis.
     
  9. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    I still feel that SLI is getting less and less love, both from nVidia's side and from developers' sides. I think they're probably waiting for Pascal to design their cards for SFR support or something. I cannot see how so many technologies just don't work with the literal only way to get more GPU power on a PC.
     
  10. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    No disagreement from me here. SLI just isn't as viable as it was a few years ago when I got my Y500.
     
  11. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    Yeah. When I got it, I literally expected every game to use it to some degree (and so said, so done), and I was more surprised than not to hear that some games were bad with it. Fast forward to late 2014 and suddenly it's "meh, maybe SLI will work for these new games". Especially with the stupidly popular Unity engine (a single-threaded, single-GPU engine which I have NO IDEA why people like) and the blooming popularity of Unreal Engine 4, SLI is just going down the drain. Maxwell GPUs' issues with voltage and clock speeds in SLI are an entirely different matter, and the slaying of new tech (DSR + G-Sync + SLI as you said, as well as MFAA) and old tech (SLI-forced 64x CSAA, etc.) just makes me want to make sure people have at least a 980 Ti before they even consider SLI.

    Did I mention to you how I couldn't get MGSV: Ground Zeroes to use more than 70-80% of my GPUs, and that I had to overclock them to get a constant 60fps at max? Literally, OCing my GPUs caused the FPS to rise and stick there (without drops), because ~75% of 850/5000 versus ~75% of 1006/6000 is a fairly large difference. It wasn't even CPU limited in the slightest.
     
  12. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    But MFAA replaces CSAA. Except MFAA doesn't work in SLI. :rolleyes:

    No I don't remember you telling me that. That's really weird.
     
    TomJGX likes this.
  13. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    I know right? :rolleyes:

    Yeah, it surprised me too. But the proof is there; I alt-tabbed from the game and simply double-clicked on my 1006/6000 profiles, and the FPS instantly went up and stuck there. Before, I was getting ~55fps with drops as low as 45. With the OC, min fps was something like 58. My CPU was being used decently, but it wasn't near max, I wasn't limited at all, and my CPU load didn't increase when I overclocked. The engine itself (at least with the 353.06 drivers) simply did not let me pass about 80% utilization, and mostly hovered near 70%.
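    (The rough arithmetic behind that, as a quick sketch; the ~75% cap and the linear FPS scaling are approximations of mine:)

    # If the engine caps GPU utilization at ~75%, usable throughput scales
    # with core clock, so FPS should scale roughly linearly with the OC.
    stock_core, oc_core = 850, 1006           # MHz, from the two profiles
    util_cap = 0.75                           # approximate engine-imposed ceiling
    print(util_cap * stock_core)              # ~638 MHz of usable work per GPU
    print(util_cap * oc_core)                 # ~755 MHz of usable work per GPU
    fps_before = 55
    print(fps_before * oc_core / stock_core)  # ~65 fps -> headroom to hold 60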

     
  14. HTWingNut

    HTWingNut Potato

    Reputations:
    21,580
    Messages:
    35,370
    Likes Received:
    9,877
    Trophy Points:
    931
    I recently purchased an ASUS ROG PG278Q G-Sync LCD. The DisplayPort keeps dropping out while playing Witcher 3. I can't figure it out. Driving me nuts.
     
  15. Fastidious Reader

    Fastidious Reader Notebook Evangelist

    Reputations:
    3
    Messages:
    365
    Likes Received:
    41
    Trophy Points:
    41
    Thinking on SLI, since it's a tech that has been with us since the late 90s (I had a model of Voodoo 2 PCI card that supported the feature), I wonder what hurdles still exist to getting it to work correctly. I remember watching someone play Far Cry 4, and their problem was that it was processing some effects twice, doubling their visual effect.
     
    ajc9988 likes this.
  16. Afif Aziz

    Afif Aziz Notebook Enthusiast

    Reputations:
    0
    Messages:
    49
    Likes Received:
    8
    Trophy Points:
    16
    About that Nvidia MXM GTX 980M: it does have a resistor of some sort to differentiate between a G-Sync and a non-G-Sync card, so I'm waiting for a new GPU from them to replace mine. It could be that a driver from a recent update killed the GPU; they are still doing a post-mortem on it.
     
    TomJGX likes this.
  17. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    It's probably because the rendering methods and engine tech, as well as the bandwidth requirements, have increased. For example, split-frame rendering and scissor-frame rendering were around back then, but now way too much data has to be transferred. Not only in required memory bandwidth, but in the amount of memory (64MB cards are really different to 2GB/4GB/8GB cards, after all =D).

    Cards weren't meant to transfer so much data across the PCI/e interface to each other, so we're stuck. nVidia, in all their beautiful "DX12! DX12! Yay DX12!" hullabaloo, "forgot" to design their current-gen "DX12 compatible" cards for such things, while AMD, the butt of all jokes, seems to be far more ready for it.
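    (Back-of-envelope sketch of that bandwidth point; illustrative numbers of mine, nothing official:)

    # The raw cost of shipping one render target between GPUs grows with
    # resolution and frame rate, and modern engines move several per frame.
    def framebuffer_mb(width, height, bytes_per_pixel=4):
        return width * height * bytes_per_pixel / 1024**2

    for w, h, hz in [(1024, 768, 60), (1920, 1080, 60), (1920, 1080, 120)]:
        mb = framebuffer_mb(w, h)
        print(f"{w}x{h}@{hz}Hz: {mb:.1f} MB/frame, {mb * hz / 1024:.2f} GB/s per target")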
     
    TomJGX and ajc9988 like this.
  18. ajc9988

    ajc9988 Death by a thousand paper cuts

    Reputations:
    1,750
    Messages:
    6,121
    Likes Received:
    8,849
    Trophy Points:
    681
    That's because Microsoft adopted elements of Mantle to make DirectX 12. Have you seen how the 290X does against the 980 Ti in Ashes?
     
  19. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    That's a different story. That's because Maxwell GPUs apparently suck at parallel processing, and AMD cards don't, and thus they pulled ahead greatly.
     
  20. ajc9988

    ajc9988 Death by a thousand paper cuts

    Reputations:
    1,750
    Messages:
    6,121
    Likes Received:
    8,849
    Trophy Points:
    681
    That is because Maxwell was purposely gimped to hyper-focus on DirectX 11. Nvidia assumed it would still rock over AMD, and now they're blaming the software designers, because they're infallible. Nvidia thought Microsoft would continue to suppress AMD; Mantle made that impossible if Microsoft was to respond quickly!
     
  21. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    wat.

    I understand the gist of what you're saying, but you need sleep XD
     
  22. ajc9988

    ajc9988 Death by a thousand paper cuts

    Reputations:
    1,750
    Messages:
    6,121
    Likes Received:
    8,849
    Trophy Points:
    681
    Although I only got 4 hours of sleep: they removed some functionality from the Maxwell architecture to tune it for gaming. What they removed from the prior design is what made them worse than AMD for bitcoin mining. The same thing that helps bitcoin mining, parallelism in part, is what makes AMD better at the new standard. The deal is, Nvidia f*ed themselves by hyper-focusing on DirectX 11. It's fact. AMD is cutting-edge on standards that get suppressed unless they're so good everyone adopts them. Here, Microsoft had no choice but to adopt asynchronous processing done the way AMD does it, while better utilizing the parallel nature of so many cores. Look it up!

    Edit: this isn't to say Nvidia doesn't have good cards, just that its cards lose efficacy on a new API they didn't design for. AMD also developed hUMA and HSA in the 2011-2012 time frame. Intel and Nvidia helped slow adoption. Now Intel is backing FreeSync, and HSA will allow the CPU, to a degree, to utilize HBM2 on Zen APUs. It will be interesting to see if AMD beats Pascal, having both HBM2 and years of honing what makes it good at DirectX 12, Mantle, and Vulkan.
     
    Last edited: Aug 31, 2015
    D2 Ultima likes this.
  23. Chrack

    Chrack Notebook Consultant

    Reputations:
    27
    Messages:
    135
    Likes Received:
    126
    Trophy Points:
    56
    Small spoiler from me :p

    [XTU screenshot]
    4.0GHz @ 1.02V = 61°C auto fan / 55°C max fan

    [XTU screenshot]
    4.5GHz @ 1.14V = 70°C auto fan / 65°C max fan

    XTU stress test at 20°C room temp ;)
    Look at my signature for how the temps were before, with the old mods!
    At 4.5GHz the CPU runs 8°C cooler with max fans than before.

    I also ran some Fire Strike tests with the CPU at 4.5GHz and the GPU at stock, just as a temp test.
    Fan auto = GPU 48°C and CPU 65°C
    Fan max = GPU 42°C and CPU 61°C
     
    ajc9988 likes this.
  24. TomJGX

    TomJGX I HATE BGA!

    Reputations:
    1,456
    Messages:
    8,707
    Likes Received:
    3,315
    Trophy Points:
    431
    Wow, what does your 4790K run on? That is such low voltage... Even my 3940XM can't run at that voltage for those speeds...
     
  25. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    In the specific case of Far Cry 4, the ghosting in SLI was because temporal SMAA was enabled. Ubisoft fixed it 5 months after release with patch 1.10.0.

     
  26. Samot

    Samot Notebook Evangelist

    Reputations:
    224
    Messages:
    610
    Likes Received:
    334
    Trophy Points:
    76
    Yeah, I also found that odd... -30mV at 4.0GHz and then -80mV at 4.5GHz?!?
     
  27. Chrack

    Chrack Notebook Consultant

    Reputations:
    27
    Messages:
    135
    Likes Received:
    126
    Trophy Points:
    56
    @TomJGX
    It's a binned 4790K that I bought from another user.
    The max OC I tested on my desktop PC was 4.8GHz at 1.310V, Prime stable ;)
    I didn't test higher.

    But in my P771ZM-G, 4.5GHz is enough.
    I'm trying to get good temperatures, as you can see in the screens above.
    But these very nice temps don't come only from the mods (delidded CPU, polished heatspreader and heatsink) :rolleyes:

    I'll tell you more once I'm done with my modification (WC) ;)
    For this I bought an extra new heatsink and bottom cover, so I can try some things.
     
    ajc9988 likes this.
  28. Chrack

    Chrack Notebook Consultant

    Reputations:
    27
    Messages:
    135
    Likes Received:
    126
    Trophy Points:
    56
    @Samot
    Why not?

    4.0GHz @ 1.024V is more than Prime stable. At 0.985V it's not Prime stable, only XTU stable.
    4.5GHz @ 1.142V is Prime stable too. At 1.121V, it's only XTU stable.

    But on my desktop PC those lower voltages are Prime stable, while in my P771 they are only XTU stable.
    So I have to set the voltage a bit higher in the P771.
     
  29. Samot

    Samot Notebook Evangelist

    Reputations:
    224
    Messages:
    610
    Likes Received:
    334
    Trophy Points:
    76
    It's just that I found those values (-80mV at 4.5GHz; the -30mV at 4GHz not so much) surprisingly good. But take caution, because that -80mV may be unstable at lower multipliers.
     
  30. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,431
    Messages:
    58,189
    Likes Received:
    17,900
    Trophy Points:
    931
    Every chip is going to react differently, especially when they can have different stock voltages.
     
  31. Samot

    Samot Notebook Evangelist

    Reputations:
    224
    Messages:
    610
    Likes Received:
    334
    Trophy Points:
    76
    Yep, that's right. Chrack seems to have a really good chip.
     
    TomJGX likes this.
  32. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,431
    Messages:
    58,189
    Likes Received:
    17,900
    Trophy Points:
    931
    Yes, looks that way :) Some very nice clocks.
     
  33. DaveFromGameaVision

    DaveFromGameaVision Notebook Consultant

    Reputations:
    9
    Messages:
    230
    Likes Received:
    89
    Trophy Points:
    41
    Is it possible to install a 120Hz panel on the P770ZM?
     
  34. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    Theoretically, if you get the P770ZM with the IPS panel (so it has the eDP connector), buy the LVDS LCD cover separately, and then buy the 50-pin eDP cable and the 120Hz screen, you should be able to install and mount it correctly.

    As for whether it will WORK or not... I dunno.
     
    DaveFromGameaVision likes this.
  35. DaveFromGameaVision

    DaveFromGameaVision Notebook Consultant

    Reputations:
    9
    Messages:
    230
    Likes Received:
    89
    Trophy Points:
    41
    Thank you for the info.

    So this model would have the eDP port, right? I've got an Alienware 17 with the 120Hz screen right now; would the cable/panel from it possibly work with the P770ZM? And what is the LVDS LCD cover?
     
  36. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    That model would work, correct.

    The LCD cover needs to be the one that houses the Chi Mei Innolux and the AUO panels. It's NOT the same one that houses the LG IPS panel. You should be able to order the LCD cover from Eurocom (or maybe from RJTech if you can find it on their site). I don't know what your AW17 uses... if it's the LG LP173WF2-TPB1, then yes, it will work. You can use Moninfo and/or HWiNFO64 to check whether the info matches up with that LG panel on www.panelook.com.
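    (For what it's worth, the vendor those tools report comes from EDID bytes 8-9, which pack a three-letter PnP code; "LGD" is LG Display. A minimal Python sketch, assuming a hypothetical "edid.bin" dump:)

    # Decode the 3-letter PnP vendor code from EDID bytes 8-9: a big-endian
    # 16-bit word holding three 5-bit letter fields (1 = 'A').
    def decode_pnp_id(edid: bytes) -> str:
        word = (edid[8] << 8) | edid[9]
        return "".join(chr(((word >> shift) & 0x1F) + ord("A") - 1)
                       for shift in (10, 5, 0))

    with open("edid.bin", "rb") as f:      # hypothetical EDID dump
        print(decode_pnp_id(f.read(128)))  # e.g. "LGD" for an LG Display panel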
     
    DaveFromGameaVision likes this.
  37. DaveFromGameaVision

    DaveFromGameaVision Notebook Consultant

    Reputations:
    9
    Messages:
    230
    Likes Received:
    89
    Trophy Points:
    41
    Looks like I've got a Samsung panel right now, which I don't think is compatible because it's a 40-pin connector, not a 50-pin like the LG you listed. So I would need the LG panel, a new cover, and a cable, I assume? Do you know the part number for the cable? Has anyone on here upgraded to 120Hz? I've looked but I can't find anything.

    edit: According to Notebookcheck, the stock screen is this LG model, correct? It's listed as an eDP 2-lane screen; does that mean I can just plug the 120Hz LG in even though it is a 4-lane?
     
    Last edited: Sep 1, 2015
  38. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    No sorry, you're out of luck.

    You need the IPS laptop SPECIFICALLY because your board won't have the eDP connector otherwise, according to most people who checked.
     
    DaveFromGameaVision likes this.
  39. DaveFromGameaVision

    DaveFromGameaVision Notebook Consultant

    Reputations:
    9
    Messages:
    230
    Likes Received:
    89
    Trophy Points:
    41
    Yeah, I figured the Alienware display working was a long shot, oh well. This is what I've got so far:
    1. I need the IPS version of the P770ZM so I have an eDP port.
    2. The P770ZM IPS screen is this model; I want to replace it with this model (or the glossy version, depending on preference).
    3. Do I need a new cable connecting the display panel to the motherboard? The stock screen is a "2 lane" and the 120Hz is a "4 lane".
    4. I will need a new LCD cover (bezel?) from the non-IPS version to match the different-sized screen.
    Thank you very much for your replies so far. I'm sorry if my questions are frustrating or dumb, but I refuse to buy an expensive laptop if it has a poor-quality screen, and there's no going back after having a 120Hz screen (at least for me).
     
  40. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    1 - Correct, i.e. the P770ZM-G from Sager/Myth/etc., or the Eurocom model with the IPS panel selected.
    2 - Correct.
    3 - Yes, you need a 50-pin (4 lane) eDP cable; that IPS panel uses a 30-pin (2 lane) eDP connector. (See the rough bandwidth math below.)
    4 - Yes, you will need the LCD cover from the non-IPS version, the one that supports the Chi Mei and AUO panels. The reason is that the IPS panel has a different mounting orientation, but the P37xSM and P37xSM-A models used the same Chi Mei and AUO panels that the P770ZM uses, as well as the 120Hz panel, all in the same LCD cover. So any LCD cover that can fit both the Chi Mei and AUO panels should fit the 120Hz panel.

    If you're willing to do the necessary modifications and attempt to get a 120Hz panel into that P770ZM, you'd be doing this whole place a favour that nobody else has bothered going through XD.
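    (Rough bandwidth math on point 3, with assumptions of mine: an HBR link at 2.7 Gbit/s per lane, ~80% usable after 8b/10b coding, 24-bit colour, and the ~285.5 MHz pixel clock a 1080p/120Hz modeline needs:)

    # Why 1080p @ 120Hz wants a 4-lane eDP link (back-of-envelope sketch).
    pixel_clock_hz = 285.53e6                 # typical 1080p/120Hz modeline
    payload_gbps = pixel_clock_hz * 24 / 1e9  # ~6.85 Gbit/s of pixel data

    for lanes in (2, 4):
        usable = lanes * 2.7 * 0.8            # Gbit/s after 8b/10b overhead
        print(lanes, "lanes HBR:", "fits" if usable >= payload_gbps else "too slow")
    # The stock 60Hz panel's ~147 MHz pixel clock (~3.5 Gbit/s of pixel data)
    # is why 2 lanes are enough for it.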
     
    DaveFromGameaVision likes this.
  41. DaveFromGameaVision

    DaveFromGameaVision Notebook Consultant

    Reputations:
    9
    Messages:
    230
    Likes Received:
    89
    Trophy Points:
    41
    Alright, that is awesome news. I'm seriously considering picking one of these up; I'm tired of Dellianware restrictions and I'd like to see just how far my 980M will go. Where is the best place to get the eDP connector? I'd rather pick up that and the LCD first, and worry about mounting once it's confirmed working. :D
     
  42. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    Eurocom is expensive, but will sell you the parts.
     
    DaveFromGameaVision likes this.
  43. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    What is your Alienware 3D display? Is it from LG? Did you run HWiNFO and/or MonInfo?
     
  44. DaveFromGameaVision

    DaveFromGameaVision Notebook Consultant

    Reputations:
    9
    Messages:
    230
    Likes Received:
    89
    Trophy Points:
    41
    Yeah, HWiNFO confirmed it's a Samsung.
     
  45. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    And is it eDP or LVDS? MonInfo should also clarify this.
     
  46. DaveFromGameaVision

    DaveFromGameaVision Notebook Consultant

    Reputations:
    9
    Messages:
    230
    Likes Received:
    89
    Trophy Points:
    41
    It's eDP; here is a picture of the motherboard connectors.

    Monitor
    Manufacturer............. Samsung
    Plug and Play ID......... SEC5044
    Data string.............. GN36T€173HT [*CP437]
    Serial number............ n/a
    Manufacture date......... 2012, ISO week 1
    Filter driver............ None
    -------------------------
    EDID revision............ 1.4
    Input signal type........ Digital (DisplayPort)
    Color bit depth.......... 6 bits per primary color
    Color encoding formats... RGB 4:4:4
    Screen size.............. 380 x 210 mm (17.1 in)
    Power management......... Not supported
    Extension blocs.......... 1 (CEA-EXT)
    -------------------------
    DDC/CI................... Not supported

    Color characteristics
    Default color space...... Non-sRGB
    Display gamma............ 2.20
    Red chromaticity......... Rx 0.553 - Ry 0.318
    Green chromaticity....... Gx 0.352 - Gy 0.586
    Blue chromaticity........ Bx 0.165 - By 0.110
    White point (default).... Wx 0.313 - Wy 0.329
    Additional descriptors... None

    Timing characteristics
    Range limits............. Not available
    GTF standard............. Not supported
    Additional descriptors... None
    Preferred timing......... Yes
    Native/preferred timing.. 1920x1080p at 60Hz (16:9)
    Modeline............... "1920x1080" 146.870 1920 1968 2000 2140 1080 1083 1088 1144 +hsync -vsync

    Standard timings supported

    EIA/CEA-861 Information
    Revision number.......... 1
    IT underscan............. Not supported
    Basic audio.............. Not supported
    YCbCr 4:4:4.............. Not supported
    YCbCr 4:2:2.............. Not supported
    Native formats........... 0
    Detailed timing #1....... 1920x1080p at 100Hz (16:9)
    Modeline............... "1920x1080" 237.940 1920 1968 2000 2080 1080 1083 1088 1144 +hsync -vsync
    Detailed timing #2....... 1920x1080p at 110Hz (16:9)
    Modeline............... "1920x1080" 261.730 1920 1968 2000 2080 1080 1083 1088 1144 +hsync -vsync
    Detailed timing #3....... 1920x1080p at 120Hz (16:9)
    Modeline............... "1920x1080" 285.530 1920 1968 2000 2080 1080 1083 1088 1144 +hsync -vsync

    Reserved general related data

    Report information
    Date generated........... 9/1/2015
    Software revision........ 2.90.0.1000
    Data source.............. Real-time 0x0100
    Operating system......... 6.2.9200.2
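    (Side note: the refresh rates can be sanity-checked straight from those modelines: refresh = pixel clock / (h_total × v_total), the totals being the last number in each sync group. A quick Python check:)

    # Sanity-check the modelines: refresh = pixel_clock / (h_total * v_total).
    modelines = {
        "native/preferred": (146.870e6, 2140, 1144),
        "detailed #3":      (285.530e6, 2080, 1144),
    }
    for name, (clock, h_total, v_total) in modelines.items():
        print(f"{name}: {clock / (h_total * v_total):.1f} Hz")  # ~60.0 and ~120.0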
     
    D2 Ultima likes this.
  47. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    Okay, thanks very much.
     
  48. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,431
    Messages:
    58,189
    Likes Received:
    17,900
    Trophy Points:
    931
    Look at all of those display chips near the connectors!
     
  49. DaveFromGameaVision

    DaveFromGameaVision Notebook Consultant

    Reputations:
    9
    Messages:
    230
    Likes Received:
    89
    Trophy Points:
    41
    Alright, I think I found the cable part number: 6-43-P37E1-020-J. They are expensive ($155)! Plus the screen for ~$90. Any input on glossy vs matte? My AW17 screen is glossy and I don't mind it; is the matte coating pretty high quality? Any idea what part number I'm looking for on the cover, or what to ask for?

    The non-G model works as well, right? I've already got a 980M, which will not work with the G model.

    edit: Apparently the P770ZM is discontinued... does anyone else know where I could get a barebones one besides RJTech? I've already got a 4790K in one of my computers that I was going to use... I guess I have to wait and see if the P770DM gets a screen upgrade.

    second edit: If I got the G-Sync model, could I theoretically upgrade it to 120Hz, or is G-Sync dependent on the display panel?
     
    Last edited: Sep 2, 2015
    TomJGX likes this.
  50. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    The non-G IPS panel model will work, but only Eurocom offered those.

    The P770ZM is discontinued for the time being due to Skylake's existence; however, I can't promise the Skylake refresh will have the same LCD cover that houses the LVDS chips. Eurocom might be able to help you out with the specifics of the system; they always have all the options, but they'll be expensive. You should be able to either use your own CPU or send them your CPU to install, if you want.

    I believe it's display-panel dependent, but a BIOS upgrade might be able to handle that, assuming you have the G-Sync 980M *hint at Prema*
     