The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information posted on the forums is preserved. For current discussions, many NBR forum users moved to NotebookTalk.net after the shutdown.

    The Official MSI GT76 Titan Owners and Discussions Lounge

    Discussion in 'MSI Reviews & Owners' Lounges' started by joskei, Jun 19, 2019.

  1. joskei

    joskei Newbie

    Reputations:
    5
    Messages:
    8
    Likes Received:
    8
    Trophy Points:
    6
    I was originally looking to purchase a Clevo laptop (p870), but after talking with Donald at HIDEvolution, he has me really interested in the upcoming GT76.

    I thought I would reach out to MSI owners and see what their feelings are on this laptop as well as the reliability/performance of their MSI laptops. My experience with MSI has always been desktop component related, and the reliability/performance has been really hit or miss.

    Any input would be appreciated!

    EDIT: Just realized I posted this in the wrong forum. Mod, please move this to owner's lounge.
     
    Last edited by a moderator: Jun 26, 2019
    paulatoohey likes this.
  2. Reciever

    Reciever D! For Dragon!

    Reputations:
    1,525
    Messages:
    5,349
    Likes Received:
    4,333
    Trophy Points:
    431
    I don't think there is an owners' lounge since no one has it yet; it was just newly announced, to the best of my knowledge.
     
  3. Support.3@XOTIC PC

    Support.3@XOTIC PC Company Representative

    Reputations:
    1,268
    Messages:
    7,186
    Likes Received:
    1,002
    Trophy Points:
    331
    Won't see one in the wild for a while, but if any of the hype plays out at all it's probably going to be the best of the 9th gen desktop replacements.
     
  4. joskei

    joskei Newbie

    Reputations:
    5
    Messages:
    8
    Likes Received:
    8
    Trophy Points:
    6
    So I initially chose a Clevo, and Donald at HIDEvolution ended up selling me on this one (which was actually cheaper than my specced-out Clevo). The process has been fantastic so far, so if anyone is like me and lurking from the interwebs looking for a high-end mobile computer, HIDEvolution has thus far been pretty awesome. I'm a software guy, so I only research hardware when I'm looking to purchase a new computer, and Donald was very patient with a lot of my (surely) stupid questions.

    Pretty hyped for the machine, as this will be the highest end machine I've ever owned!

    Specs:

    LCD Panel

    17.3" FHD (1920X1080), 144Hz 3ms IPS-Level, 100% sRGB, 72% NTSC, Anti-Glare

    Video Card

    NVIDIA GeForce RTX 2080 w/ 8GB GDDR6

    Processor

    HIDevolution Delidded Intel 9th Gen Core i9-9900K 8 Core, 16 Thread, 16MB Cache, Processor, 3.6 - 5.0 GHz - installed by HIDevolution

    Thermal Interface Materials

    Thermal Grizzly Conductonaut between Intel CPU Die and IHS + GPU, Gelid GC Extreme between IHS + Heat Sink, and Fujipoly Extreme Thermal Pads on heat sensitive surfaces - installed by HIDevolution (ONLY SELECT THIS OPTION IF DELIDDED CPU IS CHOSEN)

    Memory

    HIDevolution Approved Standard 128GB Dual Channel DDR4/2666MHz (4 x 32GB)

    M.2 PCIe RAID Options

    no RAID

    M.2 SSD Slot 1 - Supports PCIe Only

    Samsung 970 EVO Plus Series - 1TB PCIe NVMe - M.2 Internal SSD (MZ-V7S1T0B/AM) (outside purchase)

    M.2 SSD Slot 2 - Supports PCIe / SATA

    ADATA XPG SX8200 Pro 1TB 3D NAND NVMe Gen3x4 PCIe M.2 2280 Solid State Drive R/W 3500/3000MB/s SSD (ASX8200PNP-1TT-C) (outside purchase)

    M.2 SSD Slot 3 - Supports PCIe / SATA

    None

    2.5" HDD/SSD Bay

    Crucial MX500 2.5" SSD 2TB (outside purchase)

    Wireless Cards

    Intel® Wi-Fi 6 AX200 2x2 w/ Bluetooth 5.0

    Operating System

    Genuine Windows® 10 Pro, 64bit, English

    Power Adapter

    2x 230W AC Power Adapter (supports 100-240V)

    Internal Battery

    Internal 8 cell (90Wh) battery

    Audio

    2W *2 + Subwoofer 3W *1

    Keyboard

    Steelseries Per-Key RGB Keyboard
     
    TheDantee, dzpliu and win32asmguy like this.
  5. xLima

    xLima Notebook Evangelist

    Reputations:
    132
    Messages:
    567
    Likes Received:
    280
    Trophy Points:
    76
    Congrats, amazing beast!

    Sent from my BLA-L09 using Tapatalk
     
    joskei likes this.
  6. joskei

    joskei Newbie

    Reputations:
    5
    Messages:
    8
    Likes Received:
    8
    Trophy Points:
    6
    I hope it lives up to the hype! :D
     
  7. Kriegprojekt

    Kriegprojekt Newbie

    Reputations:
    0
    Messages:
    9
    Likes Received:
    6
    Trophy Points:
    6
    How long for delivery and price?
     
  8. joskei

    joskei Newbie

    Reputations:
    5
    Messages:
    8
    Likes Received:
    8
    Trophy Points:
    6
    As early as July 10th. As configured, it is running me ~$5k between HIDEvolution, Amazon, and Newegg.
     
  9. Kaibaman

    Kaibaman Notebook Enthusiast

    Reputations:
    5
    Messages:
    10
    Likes Received:
    2
    Trophy Points:
    6
    Hey man, I'm thinking of getting this as well for my first high-end laptop (my other possible choice is the Acer Predator Helios 700, but both seem the best when it comes to laptop power and thermals).
    Just wondering: why did you get Samsung for the 1st SSD slot and ADATA for the 2nd slot?

    Looking forward to your impressions on this beast!
     
  10. joskei

    joskei Newbie

    Reputations:
    5
    Messages:
    8
    Likes Received:
    8
    Trophy Points:
    6
    The reviews I've read place the ADATA as slower than the Samsung. However, I'm going to get the ADATA first and run some benchmarks against that drive to see what speeds I get in the laptop. If it performs pretty close, I'm going to save $100 and just get another ADATA. That Acer looks pretty slick, but if this MSI lives up to even half of the hype, I think I'm going to be pretty happy with it. I just hope I don't get screwed like I usually do with pre-orders. :mad:

    When I finally get the laptop, I'm going to benchmark the stock speeds, check out the overclock and benchmark that. I'll post the results here when I have them, so I can be your guinea pig if you don't mind waiting.
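    For anyone who wants a quick sanity check on sequential read speed before the proper benchmarks, here is a minimal Python sketch. It is a toy, not a replacement for CrystalDiskMark: the scratch file is small enough that the OS page cache will inflate the number, and the file path and sizes are placeholders.

```python
import os
import tempfile
import time

def sequential_read_mbps(path, chunk_size=4 * 1024 * 1024):
    """Time a sequential read of `path` and return throughput in MB/s."""
    size = os.path.getsize(path)
    start = time.perf_counter()
    with open(path, "rb", buffering=0) as f:
        while f.read(chunk_size):
            pass
    elapsed = time.perf_counter() - start
    return (size / (1024 * 1024)) / elapsed

if __name__ == "__main__":
    # Write a 16 MB scratch file, then time reading it back.
    # Use a much larger file (and a cold cache) for a realistic figure.
    with tempfile.NamedTemporaryFile(delete=False) as tmp:
        tmp.write(os.urandom(16 * 1024 * 1024))
        scratch = tmp.name
    try:
        print(f"~{sequential_read_mbps(scratch):.0f} MB/s (cache-warm)")
    finally:
        os.remove(scratch)
```

    Run it from the drive you want to test; a cache-warm run will read far above the drive's rated speed, which is itself a hint that a real benchmark tool is the right way to compare against the advertised 3500/3000 MB/s figures.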
     
  11. xLima

    xLima Notebook Evangelist

    Reputations:
    132
    Messages:
    567
    Likes Received:
    280
    Trophy Points:
    76
    Please let me know about the sx 8200 pro. I installed mine yesterday and it does not match up to all the reviews I have seen on it. Perhaps I got a dud. 2500R/2400W

    Sent from my BLA-L09 using Tapatalk
     
  12. B0B

    B0B B.O.A.T.

    Reputations:
    477
    Messages:
    1,132
    Likes Received:
    1,363
    Trophy Points:
    181
    The Ultimate Laptop (desktop replacement) deserves its own thread!



    CPU
    Up to 9th Gen. Intel® Core™ i9 socketed LGA Desktop Processor

    OS
    Windows 10 Home
    Windows 10 Pro (MSI recommends Windows 10 Pro)

    DISPLAY

    17.3" UHD (3840x2160), IPS-Level
    17.3" FHD (1920x1080), 144Hz, IPS-Level

    CHIPSET

    Intel® Z390

    GRAPHICS
    NVIDIA® GeForce RTX™ 2080 with 8GB GDDR6
    NVIDIA® GeForce RTX™ 2070 with 8GB GDDR6

    MEMORY
    DDR4-2666 Memory Type
    4 Slots Number of DIMM Slot
    Max 128GB Max Capacity

    STORAGE
    1x M.2 SSD slot (NVMe PCIe Gen3)
    2x M.2 SSD Combo slot (NVMe PCIe Gen3 / SATA)
    1x 2.5" SATA HDD

    WEBCAM
    HD type (30fps@720p)

    KEYBOARD

    Per-Key RGB Backlight Keyboard

    AC ADAPTER

    2x 230W adapter + Converter Box

    DIMENSION (WXDXH) MM

    397 x 330 x 33~42 mm
    15.6" x 13" x 1.3"-1.7" thick

    WEIGHT (W/ BATTERY)

    4.5 kg
    9.9lbs

    I/O PORTS

    1x Type-C (USB3.2 Gen2 / DP / Thunderbolt™3)
    4x Type-A USB3.2 Gen2
    1x Type-C USB3.2 Gen2
    1x RJ45
    1x Micro SD
    1x (4K @ 60Hz) HDMI
    1x Mini-DisplayPort

    AUDIO JACK
    1x Mic-in
    1x Headphone-out (HiFi / SPDIF)

    COMMUNICATION
    Killer Gb LAN
    Killer ax Wi-Fi + Bluetooth v5

    AUDIO
    2x 2W Speaker
    1x 3W Woofer

    BATTERY

    8-Cell
    Li-Ion Battery (Type)
    90 Battery (Whr)
    [Attached images: GT76-1.png, GT76-2.png, GT76-3.png, GT76-4.jpg, GT76-5.jpg]
    WHERE TO BUY
    Coming Soon as more e-tailers have availability
     
    Last edited: Sep 4, 2019
  13. B0B

    B0B B.O.A.T.

    Reputations:
    477
    Messages:
    1,132
    Likes Received:
    1,363
    Trophy Points:
    181
    I'm working on further details about this chassis. Such as it's my understanding this does NOT have G-Sync according to those with hands on time at Computex, but time will tell.
     
    Last edited: Aug 18, 2019
    robsternation and Arrrrbol like this.
  14. Ale380

    Ale380 Notebook Enthusiast

    Reputations:
    5
    Messages:
    18
    Likes Received:
    16
    Trophy Points:
    6
    Hey congrats on your new system! Do you guys know where to preorder that in europe..?
     
  15. Donald@Paladin44

    Donald@Paladin44 Retired

    Reputations:
    13,989
    Messages:
    9,257
    Likes Received:
    5,843
    Trophy Points:
    681
    You are correct, the GT76 does not support G-Sync.
     
    Arrrrbol and B0B like this.
  16. Falkentyne

    Falkentyne Notebook Prophet

    Reputations:
    8,396
    Messages:
    5,992
    Likes Received:
    8,633
    Trophy Points:
    681
    What's the reason for this? Something to do with Optimus? If this laptop does not support Optimus, G-Sync should theoretically work unless Nvidia didn't get their money. But it's unclear whether this is Optimus or MUX'd.

    The GT73VR also "did not support G-Sync" when it was released--there was not even a mention of G-Sync on the MSI homepage--and the same was true of the pre-release notes for the GT75 Titan, and that caused a nice crapstorm all over several forums. That was because the panel had not been G-Sync certified yet (meaning the Green Goblin had not been paid his luxury tax yet). But the GT73 was hardware capable of G-Sync, and eventually MSI released a BIOS to make it work.
     
    B0B likes this.
  17. Stephen1892

    Stephen1892 Notebook Enthusiast

    Reputations:
    0
    Messages:
    16
    Likes Received:
    10
    Trophy Points:
    6
    I am actually a little put off with this not having gsync. I'm looking to replace my gt73vr 7rf but I like having Gsync for older titles that I still play.
     
  18. robsternation

    robsternation Notebook Enthusiast

    Reputations:
    5
    Messages:
    13
    Likes Received:
    11
    Trophy Points:
    6
    Just got off the phone with Donald at HIDevolution. Ordered a sweet GT76 with 9900k and RTX 2080. Probably getting it sometime mid July. Cannot wait!
     
  19. B0B

    B0B B.O.A.T.

    Reputations:
    477
    Messages:
    1,132
    Likes Received:
    1,363
    Trophy Points:
    181
    Oh man!! Yes!!
     
  20. robsternation

    robsternation Notebook Enthusiast

    Reputations:
    5
    Messages:
    13
    Likes Received:
    11
    Trophy Points:
    6
    I am super stoked. Just looking at the new acres of ventilation on the bottom is sending a thrill up my leg. :D
     
    Donald@Paladin44 likes this.
  21. B0B

    B0B B.O.A.T.

    Reputations:
    477
    Messages:
    1,132
    Likes Received:
    1,363
    Trophy Points:
    181
    It looks like the right amount of absurdity Lol
     
  22. robsternation

    robsternation Notebook Enthusiast

    Reputations:
    5
    Messages:
    13
    Likes Received:
    11
    Trophy Points:
    6
    :) :) The specs on my new baby:


    Custom Built MSI GT76 Titan DT 9SG-006 - 17.3" FHD 144Hz / 4K 60Hz - i9-9900K - RTX 2080

    Loot Box Level 2 Bundle

    Gaming Bag, Dragon Doll, Gaming Headset

    Free Game Bundles

    NVIDIA RTX Wolfenstein Youngblood Bundle for products with RTX 2060, 2070, 2080, 2080 Ti

    LCD Panel

    17.3" FHD (1920X1080), 144Hz 3ms IPS-Level, 100% sRGB, 72% NTSC, Anti-Glare

    Display Warranty

    30 Days Zero Defective Pixel Warranty (perfect panel guarantee)

    Video Card

    NVIDIA GeForce RTX 2080 w/ 8GB GDDR6

    Processor

    HIDevolution Delidded Intel 9th Gen Core i9-9900K 8 Core, 16 Thread, 16MB Cache, Processor, 3.6 - 5.0 GHz - installed by HIDevolution

    Thermal Interface Materials

    Thermal Grizzly Conductonaut between Intel CPU Die and IHS + GPU, Gelid GC Extreme between IHS + Heat Sink, and Fujipoly Extreme Thermal Pads on heat sensitive surfaces - installed by HIDevolution (ONLY SELECT THIS OPTION IF DELIDDED CPU IS CHOSEN)

    Memory

    None

    M.2 PCIe RAID Options

    "Super RAID 4" RAID 0 (striping) - Must select 2x identical drives in M.2 SSD Slots 1 and 2

    M.2 SSD Slot 1 - Supports PCIe Only

    MSI Approved 512GB M.2 PCIe 3.0 x4 NVMe SSD

    M.2 SSD Slot 2 - Supports PCIe / SATA

    None

    M.2 SSD Slot 3 - Supports PCIe / SATA

    None

    2.5" HDD/SSD Bay

    Empty HDD bay (with caddy/adapter)

    Wireless Cards

    Intel® Wi-Fi 6 AX200 2x2 w/ Bluetooth 5.0 - installed by HIDevolution -

    Operating System

    Genuine Windows® 10 Pro, 64bit, English

    Operating System Clean Install

    Yes, Clean Install of Windows 10 including 32GB Clean Install Flash Drive

    Office Software

    None

    Power Adapter

    2x 230W AC Power Adapter (supports 100-240V) + Converter Box

    Internal Battery

    Internal 8 cell (90Wh) battery

    Audio

    2W *2 + Subwoofer 3W *1

    Keyboard

    Steelseries Per-Key RGB Keyboard



    --------------------------------------------------------------------------------------------------------------------------



    I ordered it without RAM and only one SSD because I'm going to be transferring over those related items from my current rig.

    Aint she a beaut? :)

    Edit: Also, a shout-out to Donald at HIDevolution. He was super helpful, spent as much time as needed explaining everything in detail, and was super nice.
     
    Donald@Paladin44 and B0B like this.
  23. B0B

    B0B B.O.A.T.

    Reputations:
    477
    Messages:
    1,132
    Likes Received:
    1,363
    Trophy Points:
    181
    ^^ Yeah that’ll do just fine!
     
  24. HyperStryker

    HyperStryker Notebook Enthusiast

    Reputations:
    11
    Messages:
    40
    Likes Received:
    33
    Trophy Points:
    26
    Congrats on the new system! I am currently about to bite the bullet on a very similar configuration to yours and was wondering if you opted for the minimal-backlight-bleed and display warranties. If so, what did you choose? If not, why did you choose not to? I've been reading up on past forum posts and most people seem to suggest getting them, but I would like to keep the initial cost of the machine down, so I've been heavily debating whether it's worth it or not. Thanks.
     
  25. Terreos

    Terreos Royal Guard

    Reputations:
    1,170
    Messages:
    1,846
    Likes Received:
    2,260
    Trophy Points:
    181
    This is a very interesting laptop, to be sure. And it really seems to be kicking the Area 51m in the happy sack. Nice clean design, more M.2 slots, better cooling, a promised 5GHz on the 9900K from the factory. No G-Sync hurts it a lot in my opinion, though. This kind of hardware screams 4K gaming, and G-Sync would be most welcome at that resolution. I'll wait to see a few reviews here on the forums and maybe I'll give @Donald@HIDevolution a call. ;)

    Honestly a middle ground 1440p 120hz panel would go a long way I think. You listening MSI? o_O
     
    rickybambi likes this.
  26. Stephen1892

    Stephen1892 Notebook Enthusiast

    Reputations:
    0
    Messages:
    16
    Likes Received:
    10
    Trophy Points:
    6
    The lack of G-Sync has totally put me off. I don't understand why they wouldn't have it; my GT73VR 7RF has it. It would feel like a step back.
     
  27. HyperStryker

    HyperStryker Notebook Enthusiast

    Reputations:
    11
    Messages:
    40
    Likes Received:
    33
    Trophy Points:
    26
    I'm really happy that MSI finally came out with a GT laptop with a desktop CPU. I've been put off mobile CPU variants due to their more limited performance in comparison to desktops, as I am looking to transition from a desktop to a desktop replacement notebook. Because of this, I am very close to purchasing this laptop (w/ 9900K, 2080, 144Hz panel) from either HID or Xotic, which differ a bit on price. Once I get that nailed down, hopefully it won't be long till I get to have fun with such a massive upgrade over my older desktop.
     
    robsternation likes this.
  28. robsternation

    robsternation Notebook Enthusiast

    Reputations:
    5
    Messages:
    13
    Likes Received:
    11
    Trophy Points:
    6
    One of the reasons that I chose to go with HIDevolution is that they offer delidding for the 9900K, and liquid metal for both the CPU and GPU. Heat dissipation is one of the biggest challenges for using these chips in a laptop, IMO. A straight-out-of-box desktop 9900K is well known to run really hot due to the way it's constructed (and that's in a desktop, so it's an even bigger issue in a laptop), and of course the RTX 2080 is another step up in terms of laptop graphics performance, and hence power draw.

    Personally, I hated the idea of buying this type of rig and having it thermal throttle on me, which would kinda defeat the purpose of it, so...yeah. But I suppose it depends on the individual priorities and expectations of the user.
     
  29. HyperStryker

    HyperStryker Notebook Enthusiast

    Reputations:
    11
    Messages:
    40
    Likes Received:
    33
    Trophy Points:
    26
    Although the 9900k does kinda run hot, it is utilizing solder for the TIM. GamersNexus tested delidding vs not and the difference was not massive, so I decided to not go with that option.

    Sent from my SM-N950U using Tapatalk
     
  30. joskei

    joskei Newbie

    Reputations:
    5
    Messages:
    8
    Likes Received:
    8
    Trophy Points:
    6
    I got the "30 day no dead pixel" and did NOT opt for the backlight bleed. In my personal experience, if a display is going to go bad, it usually happens in the first 30 days (or arrives that way), so I'll be checking very closely for the first 30 days. That's not to say down the road it might not lose a pixel or two (or even just go out), but everything performs worse the older it gets. I'm using 2 cheap Acer monitors that have seen 10-12 hours of use a day for like 5 years and I've had no issues. I'm hoping (maybe naively) that on such a premium piece of kit, the display will reflect the premium I'm paying for it.

    I decided against the backlight because, for me, the displays with bleed don't really bother me. It's going to be a personal thing. If it really bothers you, spring for the additional coverage. You're never going to eliminate all bleed on an IPS panel, and I figured it wasn't worth the extra money to guarantee I'm in the top 20% of panels. Here is a new review I found a few days ago: https://www.notebookcheck.net/MSI-GT76-9SG-Laptop-Review-The-Titan-of-Gaming-Laptops.425043.0.html and one of the things that concerned me was this portion: "The only blemish that we could find is the brightness distribution of 89%, which looks okay on paper, but in reality there is quite a bit of screen bleeding along the edges of the screen." The other was that they stated the response time was between 8-10ms, whereas it's advertised at 3ms. I'm unsure if I ate some marketing hype, or if the review model has different specs.
     
    HyperStryker likes this.
  31. robsternation

    robsternation Notebook Enthusiast

    Reputations:
    5
    Messages:
    13
    Likes Received:
    11
    Trophy Points:
    6
    I don't think this is quite right. Please see the following video from GamersNexus:



    The chart at 6:55 shows temp results of delidding vs. solder, with discussion both before and after by Steve.

    In a nutshell, the chart shows that the delid achieves about a 4 degrees C decrease in temps vs. solder, which I am assuming is what you are referring to when you say that the difference is not massive. But this is where relying on the numbers alone without context can be highly misleading.

    First, a 4 degrees difference even on its face is significant, inasmuch as the difference between solder and paste in a 9900k is only about 5 degrees, and Intel evidently believed that 5 degrees was well worth the effort of switching to solder to achieve.

    Second, Steve openly admits that they are amateurs/inexperienced at delidding a 9900k CPU, which is a different (a far more difficult, and risky) kettle of fish than delidding a previous generation chip like the 8700k. He is very open about the fact that someone more technically proficient at it, e.g. der8auer, can achieve much better results, and in fact der8auer has gotten decreases of something like 10 degrees C, or more.

    If you add together the decreases in temps from delidding and the solder (10 and 5), you will have essentially reconstituted the 15-20 degrees decreases that were achieved in delidding/LM of previous gen chips, and this should make theoretical sense.
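    To make that arithmetic explicit, here is the back-of-the-envelope math as a quick Python sketch. The deltas are the rough figures quoted in this discussion (Intel's solder gain and der8auer-class delid results), not measurements of any particular chip:

```python
# Rough temperature deltas (degrees C) quoted above.
paste_to_solder = 5      # Intel's gain moving the 9900K from paste to solder
solder_to_lm_delid = 10  # an expert LM delid vs. stock solder (der8auer-class result)

# Total improvement of an expert LM delid over old-style paste:
total_vs_paste = paste_to_solder + solder_to_lm_delid
print(total_vs_paste)  # 15 C, the low end of the 15-20 C seen on pre-solder chips
```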

    Steve pretty much acknowledges all of this in the video. It's applied science, after all. When he says that he doesn't recommend that most people do this, it is not because delidding/LM cannot achieve very significant decreases vs. solder (again, he admits it can be on the order of 10 degrees C), it's because:

    -delidding a 9900k properly, so that you get the desired results, while averting potential disaster such as completely trashing the expensive CPU, is something that most of his enthusiast viewers will not have the skills/patience/confidence/risk-appetite to try. In fact, his own semi-comical ham-handedness in attempting it is one of his arguments for not recommending it to his viewers, as his efforts are likely to be pretty much representative of the efforts of his audience, lol.

    -in conjunction with this, he makes the argument from pragmatism: all Intel CPUs "work" out of the box. They may not have ideal or even desirable temps, but they usually won't blow up or melt down. The risk-plus-effort vs. reward equation, in other words, is in most cases not going to favor trying this at home.

    But as others such as der8auer have shown (you can see his Youtube analyses on this), trying to run the 9900k on a continuous basis at 5GHz is going to get you some really high temps, possibly to the point of throttling or shut down. Which begs the question: if you are not trying to achieve high clocks (and at the very least Intel's advertised speeds of 5GHz), why would someone even buy the 9900k in the first place?



    -I agree with GamersNexus' analysis. In fact, when the 9900k first came out, I was very leery of even considering buying it for a desktop, because it made no sense to buy the 9900k if I didn't want to achieve at least 5GHz clock speeds, and based on the reviews by der8auer and others, the prospect of delidding it myself at home made my hairs stand on end.

    -Fortunately for me, I don't have to do it myself, as professional delidding is now being offered for the 9900k at affordable prices. It really comes down to this: if someone was inclined to see the significant drops in temperatures from a well-done delidding as valuable before the 9900k, then one will continue to view the (still) significant temperature drops from a well-done delidding of the 9900k as valuable now. Conversely, if you didn't then, then you won't now. The only real difference is that most of us can't do it at home anymore, and have to get it done professionally.

    I certainly respect everyone's prerogative to make their own judgment for themselves; as stated previously, everyone's priorities and expectations are different, so it's a reasonable difference of opinion. But I thought it was important to state the facts clearly.
     
    hmscott likes this.
  32. Donald@Paladin44

    Donald@Paladin44 Retired

    Reputations:
    13,989
    Messages:
    9,257
    Likes Received:
    5,843
    Trophy Points:
    681
    The first shipment has arrived...but is almost sold out already.
     
  33. Kevin

    Kevin Egregious

    Reputations:
    3,289
    Messages:
    10,780
    Likes Received:
    1,782
    Trophy Points:
    581
    Going by the GT73 and GT75, I won't be shocked at all if this laptop is never advertised as having G-Sync, yet eventually has G-Sync all the same.

    To this day you don't see a single mention of G-Sync in MSI materials for either of those machines, but they have it.
     
    hmscott and Stephen1892 like this.
  34. Falkentyne

    Falkentyne Notebook Prophet

    Reputations:
    8,396
    Messages:
    5,992
    Likes Received:
    8,633
    Trophy Points:
    681
    @JeanLegi @ryzeki Weren't all of the reviewed units all pre-release versions?
    I wonder if Optimus can be simply disabled in the unlocked bios?
     
    hmscott likes this.
  35. Stephen1892

    Stephen1892 Notebook Enthusiast

    Reputations:
    0
    Messages:
    16
    Likes Received:
    10
    Trophy Points:
    6
    I hope you're right, as I'm wanting to upgrade my GT73VR 7RF to a GT76 and G-Sync is something I'm not wanting to give up.
     
    hmscott likes this.
  36. robsternation

    robsternation Notebook Enthusiast

    Reputations:
    5
    Messages:
    13
    Likes Received:
    11
    Trophy Points:
    6
    I really hope that past is prologue in this particular case.
     
    hmscott likes this.
  37. JeanLegi

    JeanLegi Notebook Evangelist

    Reputations:
    308
    Messages:
    525
    Likes Received:
    424
    Trophy Points:
    76
    So far I don't know if Optimus can be disabled in the BIOS.
    I never found anything like that in the BIOS options of the GS63VR 6RF.
    I only know from Notebookcheck that they used a pre-release version of the GT76, but the MSI technician I talked to had a final GT76 in use during our conversation, and he confirmed that the GT76 has Optimus, which means no G-Sync.
     
    hmscott likes this.
  38. robsternation

    robsternation Notebook Enthusiast

    Reputations:
    5
    Messages:
    13
    Likes Received:
    11
    Trophy Points:
    6
    I just got a message from Donald at HIDe, and he is now telling me that he was mistaken, and that the GT76 does indeed have Optimus. Since this is the result of their inspection of their newly arrived retail stock, this is definitive.

    However, he is saying that one can select the RTX 2080 and lock it in for any application by going into the NVIDIA settings (I presume this would be in the Program Settings where you can pull up the full list of individual applications on your laptop).

    Since I have no experience with either Optimus or the RTX 2080, can anyone else confirm that this workaround is indeed the case?

    (Also, thanks for the heads-up on this issue.)
     
    hmscott likes this.
  39. Reciever

    Reciever D! For Dragon!

    Reputations:
    1,525
    Messages:
    5,349
    Likes Received:
    4,333
    Trophy Points:
    431
    So these days there isn't a PEG or dGPU mode anymore, eh?
     
    hmscott likes this.
  40. Kevin

    Kevin Egregious

    Reputations:
    3,289
    Messages:
    10,780
    Likes Received:
    1,782
    Trophy Points:
    581
    Well, it has Optimus, so forget what I said. What was MSI thinking? Why drop the MUX setup? A DTR with Optimus is an oxymoron.

    This laptop is dead to me. If they EOL the GT75 I'll have to move to another brand when it's time to upgrade.
     
    Last edited: Jun 27, 2019
    hmscott likes this.
  41. Stephen1892

    Stephen1892 Notebook Enthusiast

    Reputations:
    0
    Messages:
    16
    Likes Received:
    10
    Trophy Points:
    6
    I'm feeling the same. I was wanting to upgrade my GT73VR 7RF, but now I'm looking at different brands that I never really considered. I've never owned a laptop with Optimus before; maybe keeping the GT73 a few more months and seeing what's about is a better idea.
     
    hmscott likes this.
  42. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    7,110
    Messages:
    20,384
    Likes Received:
    25,139
    Trophy Points:
    931
    Agreed, it's a shame that MSI tried too many "new" things in the GT76... MSI should have stuck with the MUX, dGPU, iGPU modes instead of Optimus.

    Optimus without a MUX for physical display output switching forces all video to the internal display to go through the iGPU, which is co-resident on the CPU - which means the iGPU is a parasitic, power- and thermal-resource-stealing appendage that plops itself between the dGPU and the internal display.

    That means the iGPU controls the display properties, color management, and doesn't pass through or allow dGPU special features - like G-sync - from working with the internal display.

    Asus did something similar a long time ago, putting Optimus across their whole high-end gaming laptop line - for one generation. Users complained so much that Asus abandoned Optimus, and the next generation was back to dedicated dGPU only.

    A giant laptop with a literal "ton" of resources and services sucking power isn't going to see an appreciable improvement in battery time with Optimus, especially when the dGPU is powered on - tripping on due to app GPU-selection settings.

    It makes far more sense to be dGPU only, and at worst provide a switchable display connection between the dGPU and iGPU so you can boot up under one or the other for longer battery time.

    Even with a switchable mux'd iGPU-only mode the increased battery time extension for me is only about 30-45 minutes - which for me is never useful enough - better to have a light 2in1 or other light long battery run time device in the bag along with the big laptop.

    The MSI GT75 8950HK + 2080 doesn't have Optimus... :)
     
    Last edited: Jun 28, 2019
    robsternation likes this.
  43. Stephen1892

    Stephen1892 Notebook Enthusiast

    Reputations:
    0
    Messages:
    16
    Likes Received:
    10
    Trophy Points:
    6
    Well, hopefully they do some kind of revision of the GT76 or, as you say, a next-gen laptop without the Optimus.

    I have been looking at the GT75 Titan 9SG-295. I have until August really to decide what route I'll go down.
     
    hmscott likes this.
  44. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    7,110
    Messages:
    20,384
    Likes Received:
    25,139
    Trophy Points:
    931
    I was quoted 2-3 months as the delay for the 9th generation GT75 by MSI USA.

    I don't think the 8c/16t power / thermal requirements can be satisfied even with the heavy duty GT75 as it is now, at least as far as the 6c/12t 8950HK GT75 performance I am seeing indicates.

    With a GT75 6c/12t 8950HK it needs to have the power limits disabled to get much past stock frequency performance.

    Power throttling before thermal throttling is what I am seeing, with the sweet spot around 4.7ghz/4.8ghz all core with -75mV - -100mV undervolt, and I still see occasional power / current limit throttling. The 8950HK will OC clock stable all core 5.1ghz but under load it power / current limits so hard it actually isn't worth setting the clock that high. Stock clocks were 4.3ghz with a taper down from there.

    Not a fan of unlocking power limits as that leads to re-pasting, re-padding, and many lost hours just to get a few more percentage points of performance. Probably needs to upgrade the power adapters from 2 x 230w to 2 x 330w as well.

    For its intended use, the 8950HK + 2080 is more than fast enough right now.
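    As a rough illustration of why an undervolt helps against power/current limit throttling, here is a toy dynamic-power model in Python (P ≈ C · f · V²). The voltages and the scaling constant are made-up illustrative values, not measurements of the 8950HK:

```python
def cpu_power_watts(freq_ghz, volts, c=25.0):
    """Toy dynamic-power model: P ~ C * f * V^2.

    `c` is an arbitrary scaling constant chosen so stock-ish values land
    near a plausible package power; it is not a real measurement.
    """
    return c * freq_ghz * volts ** 2

stock = cpu_power_watts(4.3, 1.20)           # stock-ish all-core clock/voltage
oc = cpu_power_watts(4.8, 1.20)              # higher clock, same voltage
oc_uv = cpu_power_watts(4.8, 1.20 - 0.10)    # same clock with a -100 mV undervolt

print(f"stock: {stock:.0f} W, OC: {oc:.0f} W, OC+UV: {oc_uv:.0f} W")
```

    Because power scales with the square of voltage, the undervolted overclock in this sketch draws less than even the stock configuration, which is the intuition behind why a -75 to -100 mV offset keeps all-core clocks up under a fixed power limit.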

    What I am looking forward longer term are the Ryzen 3xxx / Navi high end gaming laptops to come. :)
     
    Last edited: Jun 28, 2019
  45. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    7,110
    Messages:
    20,384
    Likes Received:
    25,139
    Trophy Points:
    931
    Oh no... I hate to break it to you, but it looks like the 9th gen GT75 might have Optimus too...

    I googled your model number and the specifications list the UHD 630 Intel GPU!

    https://www.laptoparena.net/msi/msi-gaming-gt75(titan)9sg-295-16020

    GRAPHICS
    On-board graphics adapter Y
    On-board graphics adapter family Intel UHD Graphics
    NVIDIA G-SYNC Y
    CUDA Y
    Discrete graphics memory type GDDR6
    On-board graphics adapter DirectX version 12.0
    Discrete graphics adapter Y
    On-board graphics adapter model Intel UHD Graphics 630
    Discrete graphics adapter model NVIDIA GeForce RTX 2080
    On-board graphics adapter base frequency 350 MHz
    On-board graphics adapter dynamic frequency (max) 1250 MHz
    Maximum on-board graphics adapter memory 64 GB
    On-board graphics adapter OpenGL version 4.5

    Discrete graphics adapter memory 8 GB

    Can anyone confirm whether or not the new GT75 9th generation has moved to Optimus too?
     
    Last edited: Jun 28, 2019
  46. Stephen1892

    Stephen1892 Notebook Enthusiast

    Reputations:
    0
    Messages:
    16
    Likes Received:
    10
    Trophy Points:
    6
    Yeah, but the model I was looking at has G-Sync, and that's why I'm put off by the GT76. If I got a GT75 that has G-Sync, I guess I could live with it. The new GT76 chassis really did appeal, but as it stands I won't be getting it because of the lack of G-Sync.
     
    hmscott likes this.
  47. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    7,110
    Messages:
    20,384
    Likes Received:
    25,139
    Trophy Points:
    931
    That's what is confusing: it does list G-Sync for the dGPU, but is that on the external video ports only, or does it support G-Sync on the internal display?

    NVIDIA G-SYNC Y

    It's odd that that site's GT75(Titan)9SG-295 specification lists the UHD 630 iGPU so explicitly as the "internal" GPU; that wording is usually reserved for Optimus descriptions, and I don't see any mention of a MUX or Optimus, which is why I asked.
    https://www.laptoparena.net/msi/msi-gaming-gt75(titan)9sg-295-16020

    The 8th gen GT75 specifications page doesn't mention the Intel iGPU:
    https://us.msi.com/Laptop/GT75-Titan-8SX/Specification

    And neither does the 9th gen GT75 specifications page mention the Intel GPU:
    https://us.msi.com/Laptop/GT75-Titan-9SX/Specification

    So maybe it's just that particular site's way of putting in the specs?

    The GT76 should also support G-sync on the external display ports.
     
    Last edited: Jun 28, 2019
  48. robsternation

    robsternation Notebook Enthusiast

    Reputations:
    5
    Messages:
    13
    Likes Received:
    11
    Trophy Points:
    6
    What happens if I go into the NVIDIA settings and manually select the dGPU (RTX 2080) for an individual application in the Program Settings? Will the iGPU still be "parasitic" and therefore degrade the graphic performance of the machine, even when the discrete GPU has been overtly selected for that application?
     
    hmscott likes this.
  49. Semmy

    Semmy Notebook Consultant

    Reputations:
    38
    Messages:
    210
    Likes Received:
    140
    Trophy Points:
    56
    Video review of the GT76

     
    joskei and hmscott like this.
  50. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    7,110
    Messages:
    20,384
    Likes Received:
    25,139
    Trophy Points:
    931
    Here's a quote from Nvidia's current docs on "Introduction to Optimus" for developers:

    "Using NVIDIA’s Optimus technology, when the discrete GPU is handling all the rendering duties, the final image output to the display is still handled by the Intel integrated graphics processor (IGP). In effect, the IGP is only being used as a simple display controller, resulting in a seamless, flicker-free experience with no need to reboot."
    https://docs.nvidia.com/cuda/optimus-developer-guide/index.html

    Optimus sux because you can't turn it off. When Optimus is designed into the laptop, the iGPU is wired to the internal display (and sometimes to one of the external display ports), and with the iGPU being the active controller for the internal display it heats up and, sitting on the same die, reduces the thermal headroom of the *CPU*.

    That's right, with Optimus the CPU performance and thermals are hit not the dGPU performance.

    When this happened with Asus on a generation of G750's, the CPU was the same as the previous generation (Haswell), and the models with Optimus were running 10C hotter than the non-Optimus models.

    That's one of the times undervolting became so popular, as it dropped the 100% load temps by 10C and brought the CPUs back below the thermal throttling point.

    I didn't need to undervolt the previous generation G750 with the same CPU because the iGPU wasn't wired up or powered on.

    Using Nvidia / Windows to set Optimus iGPU / dGPU per application can force dGPU to be used, but the iGPU is still powered on and passing the video through from the dGPU.

    Some applications don't respect that forced dGPU setting and will still use the Intel iGPU. Windows 8.1 and earlier could only use the iGPU for rendering; I'm not sure about Windows 10 - it might use the dGPU - but I doubt it.

    This is why people were so disappointed with the new G750's they bought as upgrades, only to find that browsing with IE or just using Windows itself would heat up the CPU and kick on the fans, whereas before, without the iGPU powered up, they could browse the internet and watch video rendered by the dGPU alone while the CPU fan stayed silent.

    I'll see if I can dig up the info on whether Windows 10 finally uses the dGPU. Or, if you have Optimus, you can try it yourself: right-click on IE, force it to use the dGPU, then use an Optimus monitoring / state tool to see whether IE actually switches to the dGPU or stays locked to the iGPU.
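    One way to do that check from a command line: `nvidia-smi` (installed alongside the NVIDIA driver) prints a "Processes" table showing what's currently running on the dGPU. Below is a minimal Python sketch, not from this thread, for pulling process names out of captured `nvidia-smi` text. The table layout used here is an assumption; it varies across driver versions, and on Optimus laptops in WDDM mode the table can be empty:

```python
def processes_on_dgpu(nvidia_smi_text):
    """Return the process names listed in nvidia-smi's 'Processes' table.

    Rows in that table end with a memory figure like '180MiB', and the
    process name is the field just before it. The exact layout varies by
    driver version, so treat this as illustrative rather than definitive.
    """
    procs = []
    for line in nvidia_smi_text.splitlines():
        fields = line.strip("| ").split()
        if len(fields) >= 4 and fields[-1].endswith("MiB"):
            procs.append(fields[-2])
    return procs

# Made-up sample of the bottom of nvidia-smi's output, for illustration:
sample = """\
|  GPU   PID  Type  Process name      GPU Memory |
|    0  4242     G  iexplore.exe          180MiB |
|    0  5150     C  python.exe            512MiB |
"""
print(processes_on_dgpu(sample))  # -> ['iexplore.exe', 'python.exe']
```

    If the browser process never shows up in that table while it's rendering, it's still stuck on the iGPU.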

    Now, there are a lot of people who say they use Optimus without issues, but days, weeks, or months later they hit one of those Optimus gotcha situations. So be aware: it might be OK for you at the start, but as you install more software you may find apps that won't switch to the dGPU, leaving you stuck with iGPU-only operation for those tools.

    Edit: I think this guy was speaking about Windows 10; I'll see if I can find a more definitive mention:

    Joshua Martin Reputable, May 3, 2015
    "No, it will NOT use the Nvidia card for basic Windows. Keep it on "High performance processor" so it'll auto switch the the Nvidia card when launching appropriate applications."
    https://forums.tomshardware.com/threads/how-to-set-dedicated-gpu-as-default.2271030/post-15111074

    Here's the original Optimus Whitepaper, and at the end it describes how internet browsing is done on the iGPU:
    https://www.nvidia.com/object/LO_optimus_whitepapers.html
    Found here: https://www.nvidia.com/object/optimus_technology.html

    "Scenario 3 – Web Browsing: When simply browsing the Web and checking email, the processing power of the GPU is largely unutilized. However, as Flash-based content continues to grow there are situations like watching streaming HD content where Optimus can harness the GPU's power to provide the best possible experience.

    Experience on IGP: Running on integrated graphics, general Web browsing will perform as expected. This will provide the highest possible battery life without sacrificing the browsing experience.

    Experience on GPU: Although the GPU will provide the same experience as the integrated graphics when doing general Web browsing, the GPU will consume more power resulting in lower overall battery life.

    How Optimus Reacts: Optimus detects that there are no intensive applications launching and keeps the GPU powered down. In doing so, the system is able to observe the highest possible battery life without any sacrifice of performance or the overall user experience. However, if the user is streaming Flash video using a website like YouTube or Hulu, Optimus will recognize the performance and quality benefits the GPU provides to Adobe Flash 10.1 (especially HD and high quality content) and will enable the discrete GPU."

    Optimus was designed to use the iGPU as much as possible and only call the dGPU in for heavy duty rendering.

    Microsoft Edge may be able to take advantage of the dGPU; by stepping outside the Windows desktop paradigm it may have freed itself from being locked into iGPU-only primary use. But everything else running as part of the Windows desktop environment would still be iGPU-only in Windows 10 unless redone to support the dGPU directly.

    Again, you can check it out by forcing the Edge and IE apps to use the dGPU and seeing if they stick to dGPU-only usage.
     
    Last edited: Jun 28, 2019
    robsternation likes this.
 Next page →