The Notebook Review forums were hosted by TechTarget, who shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information posted on the forums was preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.

    *** The Official MSI GT75 Owners and Discussions Lounge ***

    Discussion in 'MSI Reviews & Owners' Lounges' started by Spartan@HIDevolution, Jun 23, 2017.

  1. Matjaz Pecavar

    Matjaz Pecavar Notebook Enthusiast

    Reputations:
    0
    Messages:
    14
    Likes Received:
    5
    Trophy Points:
    6
    Thanks, now I don't mind if the coolers work hard :) and are noisy :p Should I rather look at the 1080 with a 7th gen processor, or the 1070 with an 8th gen? :)...
    Most of the time I will run an Android emulator on the PC and one game in it. In my free time, like some weekends, I want to play BF4 again :p
     
  2. Pedro69

    Pedro69 Notebook Evangelist

    Reputations:
    84
    Messages:
    572
    Likes Received:
    221
    Trophy Points:
    56
    The GTX 1080 will be 20-25% more performance than the GTX 1070... Unless you're going to OC, you should use the 8750H...
     
  3. Matjaz Pecavar

    Matjaz Pecavar Notebook Enthusiast

    Reputations:
    0
    Messages:
    14
    Likes Received:
    5
    Trophy Points:
    6
    This ain't looking too hard :) I had worse cases :F

     
  4. Falkentyne

    Falkentyne Notebook Prophet

    Reputations:
    8,396
    Messages:
    5,992
    Likes Received:
    8,633
    Trophy Points:
    681
    This video just gave me stomach cancer.
    HE FREAKING DISASSEMBLED THE FIRST HALF OF THE LAPTOP WITH THE BATTERY CABLE CONNECTED!! What in God's name was he thinking ????????????????????
     
  5. Matjaz Pecavar

    Matjaz Pecavar Notebook Enthusiast

    Reputations:
    0
    Messages:
    14
    Likes Received:
    5
    Trophy Points:
    6
    I saw that too... Weird procedure. But the rest is quite easy
     
    hmscott likes this.
  6. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    7,110
    Messages:
    20,384
    Likes Received:
    25,139
    Trophy Points:
    931
    It's actually much more common than you would believe.

    There are many videos out there where the power cable is connected, and some even "bump the power button" and it turns on while spread wide open.

    And some will still argue that it isn't a problem, while at the same time complaining that their laptop no longer powers up.

    It's amazing how the blithe ignorance of newbies can carry them through without even realizing just how lucky they've been to successfully complete a tear down and get it working again.

    That's why I so fervently recommend not opening and re-pasting; I've seen too many people ruin perfectly good laptops for no real performance advantage. Most of the time it's for posing, for vanity reasons.

    Why take the chance? Be happy with what you have and tune it with software, you'll have more fun gaming anyway. :)
     
    Last edited: Jun 3, 2018
    Pedro69 and Falkentyne like this.
  7. Pedro69

    Pedro69 Notebook Evangelist

    Reputations:
    84
    Messages:
    572
    Likes Received:
    221
    Trophy Points:
    56
    I used this YouTube channel to check the disassembly of my GT72VR... he also has one for the GT75.

     
    Last edited: Jun 8, 2018
  8. JeanLegi

    JeanLegi Notebook Evangelist

    Reputations:
    308
    Messages:
    525
    Likes Received:
    424
    Trophy Points:
    76
    Maybe he is Chuck Norris or John Hannibal Smith...
    The-A-Team.jpg

     
    Last edited: Jun 4, 2018
    Pedro69 and hmscott like this.
  9. prodj

    prodj Notebook Consultant

    Reputations:
    15
    Messages:
    113
    Likes Received:
    25
    Trophy Points:
    41
    Hey guys, I just bought my GT75 Titan 8RG and I can't see the integrated GPU anywhere. Is that normal? Is it disabled in the BIOS or something?
     
  10. heliada

    heliada Notebook Evangelist

    Reputations:
    259
    Messages:
    593
    Likes Received:
    513
    Trophy Points:
    106
    Yeah you are stuck with the Nvidia one. Hope you didn't need the Intel card to run some specific programs.
     
  11. Pedro69

    Pedro69 Notebook Evangelist

    Reputations:
    84
    Messages:
    572
    Likes Received:
    221
    Trophy Points:
    56
    Perhaps @Paloseco can help you with that, but you need to unlock your BIOS...
     
  12. prodj

    prodj Notebook Consultant

    Reputations:
    15
    Messages:
    113
    Likes Received:
    25
    Trophy Points:
    41
    I don't really need it, it's just unexpected. I thought all GT laptops could switch between GPUs.
    Why did they discontinue that option?
     
  13. zipperi

    zipperi Notebook Deity

    Reputations:
    53
    Messages:
    741
    Likes Received:
    190
    Trophy Points:
    56
    Pretty useless - I haven't even switched mine once in three months on the GT73VR. I don't need that small improvement in battery life.
     
  14. Pedro69

    Pedro69 Notebook Evangelist

    Reputations:
    84
    Messages:
    572
    Likes Received:
    221
    Trophy Points:
    56
    But why do you need the iGPU?
     
  15. prodj

    prodj Notebook Consultant

    Reputations:
    15
    Messages:
    113
    Likes Received:
    25
    Trophy Points:
    41
    How do I flash the vBIOS?
    What will happen if I disable the GTX 1080 in Device Manager?
     
  16. prodj

    prodj Notebook Consultant

    Reputations:
    15
    Messages:
    113
    Likes Received:
    25
    Trophy Points:
    41
    OK, it just goes to one monitor at low resolution. Sucks, but OK, whatever.
     
  17. Kevin@GenTechPC

    Kevin@GenTechPC Company Representative

    Reputations:
    1,014
    Messages:
    8,500
    Likes Received:
    2,098
    Trophy Points:
    331
    It's better this way without Optimus, as that one extra hop in the circuitry can cause micro-stutter in some cases.
     
    hmscott likes this.
  18. prodj

    prodj Notebook Consultant

    Reputations:
    15
    Messages:
    113
    Likes Received:
    25
    Trophy Points:
    41
    Sure, if there is a 0.01% chance that it can cause a problem, or if it decreases performance by 0.01%, it's not worth it in a gaming laptop.

    Also: it doesn't work with the 980M in my GT60 anyway.
     
    Kevin@GenTechPC likes this.
  19. heliada

    heliada Notebook Evangelist

    Reputations:
    259
    Messages:
    593
    Likes Received:
    513
    Trophy Points:
    106
    Some GT series have the iGPU completely disabled (the GT72VR series had that, now the GT75), and some have a hardware switching button which requires a reboot (the GT73VR, and I think the GT75VR too). Only the lower series such as the GS, GL, etc. come with Optimus, where both GPUs are enabled at the same time and the dedicated GPU only kicks in when needed to power the applications that are set to use it. I believe Optimus laptops are not compatible with G-Sync.
    I don't understand why it causes stutters these days. I had Optimus running on my old Asus with an Nvidia GT 635M and never noticed any problems with it at all... then again, that was on extremely old drivers (340 or something along those lines) and Windows 10 versions below the Creators Update. Sad that it does not work as expected on the newer laptops.
     
    hmscott likes this.
  20. Spartan@HIDevolution

    Spartan@HIDevolution Company Representative

    Reputations:
    39,604
    Messages:
    23,562
    Likes Received:
    36,865
    Trophy Points:
    931
    Switching between GPUs like we were able to on the previous GT73VR Titan Pro is not Optimus. The graphics cards don't work simultaneously, hence it's not Optimus. You run one or the other: for those times when you want more battery life, you press a button on the laptop (or change it via the BIOS) to switch to the dedicated/integrated GPU, which does not cause any stuttering or issues.

    Do I miss this being removed from my GT75 Titan? Not one bit. I never used it personally as I am always plugged in, but I understand some people may want to use it to squeeze more battery life out of their laptops.
     
    hmscott and Kevin@GenTechPC like this.
  21. Kevin@GenTechPC

    Kevin@GenTechPC Company Representative

    Reputations:
    1,014
    Messages:
    8,500
    Likes Received:
    2,098
    Trophy Points:
    331
    The Nvidia dGPU has a power saving mode while on battery, but it still consumes more power than the iGPU because it's two components versus one. It's harder to satisfy both at the same time, but since it's a gaming system we like that raw power.
     
    hmscott, Pedro69 and prodj like this.
  22. prodj

    prodj Notebook Consultant

    Reputations:
    15
    Messages:
    113
    Likes Received:
    25
    Trophy Points:
    41
    I just found out that this massive gaming laptop has a soldered CPU.....
    Why?? I really didn't expect that, because even my GT60 has a PGA socket...

    Well... I couldn't find an LGA Clevo in the country I currently live in anyway...

    Maybe I should return the laptop and just go for a desktop...

    Is it really bad? If there's no way to upgrade, what do you do after 3 years? Just trash it? Smh..
     
    Last edited by a moderator: Jun 9, 2018
  23. zipperi

    zipperi Notebook Deity

    Reputations:
    53
    Messages:
    741
    Likes Received:
    190
    Trophy Points:
    56
    So? Your 8750H is about 50% faster than a 3630QM - that's not much development in 6 years. Other factors do prompt an upgrade - faster interfaces and graphics, for instance. I'm still pleased with my 3630QM's performance in my backup laptop. If you want desktop processors, get a Sager/Clevo clone.
     
    Kevin@GenTechPC likes this.
  24. JeanLegi

    JeanLegi Notebook Evangelist

    Reputations:
    308
    Messages:
    525
    Likes Received:
    424
    Trophy Points:
    76
    22nm vs 14nm++?
    6 cores at 45W TDP instead of 4?

    I don't think we would have a 6-core mobile CPU on 22nm for laptops, and if we did, I think they would be much thicker than the GTxx series laptops.
     
  25. Kevin@GenTechPC

    Kevin@GenTechPC Company Representative

    Reputations:
    1,014
    Messages:
    8,500
    Likes Received:
    2,098
    Trophy Points:
    331
    I concur with zipperi. We have many enthusiasts on the forum who would love to have PGA, but with the current market situation LGA is pretty much meant for desktop enthusiasts, while BGA is mainly meant for notebook systems because not many users upgrade their processors nowadays. Besides, each year a new platform/generation comes out which requires more pins, a newer chipset, more cost, etc., which makes CPU/GPU upgrades less appealing.
    There's the MSI 16L13 to go with if you must have LGA; see below for a review. You can also check with Eurocom for the Coffee Lake offering, specifically the F7 model.
    http://www.notebookreview.com/notebookreview/eurocom-tornado-f5-msi-16l13-review
     
    Last edited: Jun 9, 2018
  26. Papusan

    Papusan Jokebook's Sucks! Dont waste your $$$ on Filthy

    Reputations:
    42,712
    Messages:
    29,847
    Likes Received:
    59,652
    Trophy Points:
    931
    No more Pin Grid Array (PGA) sir. Land Grid Array (LGA).
     
    Pedro69 and Kevin@GenTechPC like this.
  27. Kevin@GenTechPC

    Kevin@GenTechPC Company Representative

    Reputations:
    1,014
    Messages:
    8,500
    Likes Received:
    2,098
    Trophy Points:
    331
    Papusan likes this.
  28. prodj

    prodj Notebook Consultant

    Reputations:
    15
    Messages:
    113
    Likes Received:
    25
    Trophy Points:
    41
    OK, I see what is happening here...

    Another question:
    I have an external monitor connected via HDMI. The laptop turns off the displays after some time of inactivity; when I move the mouse or something, only the laptop's display turns back on, but not the external one. What's up with that and how do I fix it?
     
  29. prodj

    prodj Notebook Consultant

    Reputations:
    15
    Messages:
    113
    Likes Received:
    25
    Trophy Points:
    41
    Maybe it's the monitor's problem. HDMI 1 works OK (but it's only 4K at 30 Hz); HDMI 2 doesn't turn the display on in this case, even though the power button light changes.
    edited
     
    Last edited: Jun 10, 2018
  30. JeanLegi

    JeanLegi Notebook Evangelist

    Reputations:
    308
    Messages:
    525
    Likes Received:
    424
    Trophy Points:
    76
    Hard to say without knowing which kind of external display you use...
    In this case I have to guess, and would say your HDMI 2 port could be an MHL HDMI port; that port is normally for connecting mobile phones and tablets to your display.
     
  31. prodj

    prodj Notebook Consultant

    Reputations:
    15
    Messages:
    113
    Likes Received:
    25
    Trophy Points:
    41
    HDMI 1 is low speed and only does 4K at 30 Hz. It switches on perfectly, but I can't use 30 Hz. ))

    HDMI 2 is good speed.
    It switches on perfectly with my old laptop, which sends a 1080p signal.
    But it doesn't switch on with this laptop, which sends a 4K 60 Hz signal. I have to power it off and on manually or re-insert the HDMI cable.

    Is it possible to send a 1080p signal from this laptop? Because if I change the resolution, it just scales it but still sends 4K 60 Hz anyway.
     
  32. prodj

    prodj Notebook Consultant

    Reputations:
    15
    Messages:
    113
    Likes Received:
    25
    Trophy Points:
    41
    So, I set 1440p in the Nvidia panel and now it switches on without a problem.
    I will try 4K through DisplayPort when my miniDP-to-DP cable arrives.
     
    Last edited: Jun 10, 2018
  33. prodj

    prodj Notebook Consultant

    Reputations:
    15
    Messages:
    113
    Likes Received:
    25
    Trophy Points:
    41
    Does anybody have a 4K external monitor? I want to make sure whether it's a laptop or a monitor problem. It looks like the monitor. (I don't think it's the cable.)
     
    Last edited: Jun 10, 2018
  34. JeanLegi

    JeanLegi Notebook Evangelist

    Reputations:
    308
    Messages:
    525
    Likes Received:
    424
    Trophy Points:
    76
    Which HDMI version does the cable support?
    2.1, 2.0x, 1.4, or 1.3 or less?
    If it's below 2.0 you have no 4K 60 Hz support.

    For more information you can use http://www.giyf.com
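    As a rough sanity check on the bandwidth side, here is a back-of-the-envelope sketch (assuming 8-bit RGB and typical CTA/CVT pixel clocks; real requirements vary with blanking and chroma format) of why 4K 60 Hz needs an HDMI 2.0-class link while 4K 30 Hz and 1440p 60 Hz fit within HDMI 1.4:

        # Rough HDMI bandwidth estimate -- a sketch, not a spec-accurate calculator.
        # Assumes 8-bit RGB (24 bpp) and approximate standard pixel clocks.

        MODES_MHZ = {
            "3840x2160 @ 60 Hz": 594.0,   # typical CTA-861 pixel clock
            "3840x2160 @ 30 Hz": 297.0,
            "2560x1440 @ 60 Hz": 241.5,   # approximate CVT-RB timing
            "1920x1080 @ 60 Hz": 148.5,
        }
        BITS_PER_PIXEL = 24  # 8-bit RGB, no HDR

        # Approximate usable data rates (after 8b/10b coding) per HDMI generation.
        LINK_LIMITS_GBPS = {"HDMI 1.4": 8.16, "HDMI 2.0": 14.4}

        for mode, clock_mhz in MODES_MHZ.items():
            gbps = clock_mhz * 1e6 * BITS_PER_PIXEL / 1e9
            fits = [gen for gen, limit in LINK_LIMITS_GBPS.items() if gbps <= limit]
            print(f"{mode}: ~{gbps:.1f} Gbps -> {', '.join(fits) or 'needs more than HDMI 2.0'}")

    4K 60 Hz sits right at the HDMI 2.0 limit, which fits the symptoms above: 4K 30 Hz and 1440p 60 Hz behave, while 4K 60 Hz is the first mode to act up over a marginal cable or port.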
     
  35. prodj

    prodj Notebook Consultant

    Reputations:
    15
    Messages:
    113
    Likes Received:
    25
    Trophy Points:
    41
    I said that I have 4K 60 Hz working; it's just that at this resolution the monitor doesn't switch on by itself when I move the mouse or something.
    With lower resolutions like 1440p 60 Hz or 4K 30 Hz, it does.
    I just wonder if somebody else has experienced that or can test it.
     
  36. prodj

    prodj Notebook Consultant

    Reputations:
    15
    Messages:
    113
    Likes Received:
    25
    Trophy Points:
    41
    DisplayPort works OK (just tested it now).
    Now I need a new HDMI cable to check whether it's a cable problem, but I can't really imagine how the cable would cause that.
     
  37. Kevin@GenTechPC

    Kevin@GenTechPC Company Representative

    Reputations:
    1,014
    Messages:
    8,500
    Likes Received:
    2,098
    Trophy Points:
    331
    The quality of the cable matters in some cases. Use a higher-rated cable to get the bandwidth necessary to drive that resolution.
     
  38. Ghost 350

    Ghost 350 Notebook Consultant

    Reputations:
    16
    Messages:
    104
    Likes Received:
    61
    Trophy Points:
    41

    How did you get all three 960 Pros to work in RAID 0 when only two slots support NVMe and the other is just SATA? Wouldn't the speeds be affected?
     
  39. JeanLegi

    JeanLegi Notebook Evangelist

    Reputations:
    308
    Messages:
    525
    Likes Received:
    424
    Trophy Points:
    76
    The new GT75 has 2 combo slots and one pure NVMe slot.
     
    Donald@Paladin44 likes this.
  40. Spartan@HIDevolution

    Spartan@HIDevolution Company Representative

    Reputations:
    39,604
    Messages:
    23,562
    Likes Received:
    36,865
    Trophy Points:
    931
    They are all NVMe on the new GT75
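    For what it's worth on the speed question: if one stripe member really were SATA-limited, the array would feel it. A rough rule of thumb is that RAID 0 sequential throughput is about member count times the slowest member. A tiny sketch with made-up, illustrative drive speeds (not measured 960 Pro numbers):

        # Rule-of-thumb RAID 0 sequential throughput: member_count * slowest member.
        # Drive speeds below are illustrative placeholders, not measured values.

        def raid0_seq_estimate(member_speeds_mb_s):
            """Very rough sequential throughput estimate for a RAID 0 stripe."""
            return len(member_speeds_mb_s) * min(member_speeds_mb_s)

        all_nvme = [3200, 3200, 3200]   # three NVMe drives
        mixed    = [3200, 3200, 550]    # two NVMe + one SATA-limited slot

        print("All-NVMe stripe:", raid0_seq_estimate(all_nvme), "MB/s (rough)")
        print("Mixed stripe   :", raid0_seq_estimate(mixed), "MB/s (rough)")

    With all three slots being NVMe on the new GT75, that penalty doesn't apply.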
     
    Donald@Paladin44 likes this.
  41. Alatar

    Alatar Notebook Guru

    Reputations:
    16
    Messages:
    51
    Likes Received:
    13
    Trophy Points:
    16
    I finally got my GT75 Titan 8RG in (from HIDevolution)! The original had a bad pixel so I had to send it back, lol. Running beautifully now. Just a question, if anyone might be able to answer it:

    Running benchmarks/games, my Kill-A-Watt is showing a max of around 275 watts (at stock clocks). Is it possible to buy a single 330W power brick for traveling while my dual 230W setup stays at home? Is it even a good idea to do that, or is 275W too close to the 330W output of the supply?

    If possible/not inadvisable, where would I go about ordering such a thing? Is the power supply for the GT73 the same (if so, HIDevolution has those on their site)?
     
    Last edited: Jun 13, 2018
  42. Falkentyne

    Falkentyne Notebook Prophet

    Reputations:
    8,396
    Messages:
    5,992
    Likes Received:
    8,633
    Trophy Points:
    681
    You *can*, but I can't help you with this. I have NO idea how the EC works or how it functions with power supplies of a lower capacity.
    The Eurocom 780W PSU has been tested on your model without the EC throttling the CPU at heavy GPU+CPU load, so that has been proven to work. The single 330W PSU has not been tested, so it would be "spend your money and try it out", as I am no longer able to help users with your systems since I do not have access to one. I know how the older MSI laptops worked and how you could change 'EC RAM' registers in RW Everything and use higher-wattage power supplies to bypass EC-based power restrictions. Example: TDP modding a GTX 1070 to 200W to match the GTX 1080 TDP, buying a 330W PSU, and then changing a value in the EC (via RW Everything) to set the power ID to the GTX 1080 power ID; failure to do this would cause extreme CPU power throttling once 230W was exceeded, as well as EXTREME battery drain. But again, NO ONE has tested what will happen with a GT75 Titan and a single 330W.

    If you do decide to experiment for the benefit of the rest of us, please run ThrottleStop while gaming on the 330W, make sure your combined system power load does NOT ever exceed 370W from the wall (going past this will trip the safety circuit of the PSU; the PSUs are always capable of drawing more total power than their rating), and in ThrottleStop pay VERY CLOSE attention to the "Limit" checkbox and check for CPU "Power Limit" throttling, e.g. the PL1/PL2 TDP dropping to 45W.

    On the GT73VR SLI system (2x 230W + 2x GTX 1070), unplugging one of the two power supplies and using only one power brick causes extreme CPU throttling (as if the system is somehow able to detect that only one PSU is plugged in and then throttles the system at 230W of total AC power). The 2x 230W setup is rated for 460W of total power, yet somehow it manages to detect that only one power brick is active. How, I don't know. You would think that having one 230W PSU plugged into the dongle would simply overdraw the PSU (past 230W) and cause it to power down, but instead the CPU just throttles ( @sirgeorge tested that). That's because the "master" power ID value for a GTX 1070 system is "90" (which means 230W), whether it's single or SLI; something causes that 230W to be doubled when both power bricks are in use.

    For example, using a 230W PSU, a TDP-modded GTX 1070 (modded to 200W), and the master power ID changed to 330W (ID=91) on a GT73VR, the system will try to draw 330W. The system shuts the 230W PSU off past 250W (>280W from the wall).

    Sorry if this confuses you. tl;dr: try it, and if you do, check ThrottleStop's "Limit Reasons" for bizarre CPU throttling.
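    If anyone does run that experiment, here is a rough power-budget sketch of where the wall draw might land relative to that ~370W figure. The adapter efficiency and platform overhead below are assumptions for illustration, not measured values:

        # Rough wall-power budget check for the single-330W-PSU experiment.
        # All figures are illustrative assumptions, not measurements.

        PSU_RATING_W   = 330.0   # nameplate rating of the single brick
        TRIP_WALL_W    = 370.0   # approximate wall draw where the PSU may trip (see above)
        PSU_EFFICIENCY = 0.88    # assumed adapter efficiency (DC out / AC in)

        def wall_draw(cpu_w, gpu_w, platform_w=60.0):
            """Estimate AC wall draw from assumed DC component loads."""
            return (cpu_w + gpu_w + platform_w) / PSU_EFFICIENCY

        for cpu_w, gpu_w in [(45, 190), (90, 190), (120, 200)]:
            ac = wall_draw(cpu_w, gpu_w)
            verdict = "OK" if ac < TRIP_WALL_W else "over the ~370W estimate"
            print(f"CPU {cpu_w}W + GPU {gpu_w}W -> ~{ac:.0f}W at the wall ({verdict})")

    The stock ~275W reading above leaves some margin, but a sustained CPU+GPU load can push past it quickly, which is the kind of situation ThrottleStop's Limit Reasons should flag.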
     
    JeanLegi likes this.
  43. Alatar

    Alatar Notebook Guru

    Reputations:
    16
    Messages:
    51
    Likes Received:
    13
    Trophy Points:
    16
    Thanks for the info. I'll see how annoying it is to use the dual supply in the car tomorrow and consider whether I want to make the purchase at that time. I'll let you all know how it works if I decide to pick one up.
     
  44. an0n

    an0n Newbie

    Reputations:
    0
    Messages:
    2
    Likes Received:
    0
    Trophy Points:
    5
    Hello all, I have several questions. I am looking to buy an N173HHE-G32 panel as a replacement in a different laptop. I need a screen with the lowest response times possible. Notebookcheck has reviewed many laptops with this panel, and nearly all of them had 25ms+ GtG / 10ms+ BtW response times.

    However, the only laptops with this panel that have a decent response time (less than 10ms GTG/6ms) are the GT75 laptops.

    I have learned that there are variations in the revisions of the panels, and that the Rev. C2 is a "5ms" panel, and that Rev. C3 is a "3ms" panel (according to MSI, as stated by an MSI employee). Knowing this, I thought that the difference was that the GT75 had C3s and the rest had older C2 panels.

    HOWEVER, I found an outlier. The MSI GE73 8RF Raider RGB has a 3ms (as stated by MSI) N173HHE-G32 panel.

    According to Notebookcheck, that laptop's screen has a GtG response time of 28ms and a BtW response time of 13.4ms, similar to all the rest of the N173HHE panels. All the other stats (color, etc.) are nearly identical between the Titan panels and this one.

    I wish to get a panel with response times less than 10ms.

    My questions are:
    Do the GT75 models come shipped with a Rev. C3 panel?
    And, do the GT75 panels actually have a 10ms GTG response time as reported by notebookcheck?
    Why would the panel in the GE73 8RF have a twice as high response time if it uses the same "3ms" N173HHE-G32 panel?



    Thank you for reading, I hope someone with some technical knowledge can chime in.
     
  45. raz8020

    raz8020 Notebook Consultant

    Reputations:
    520
    Messages:
    225
    Likes Received:
    303
    Trophy Points:
    76
    Yes, there are multiple revisions of the N173HHE-G32 panel, and there are differences in specs and response times between revisions. There are also slight differences in specs (regardless of the panel brand or manufacturer) even for the same panel with the same revision.

    Depending on the shade, some transitions will be faster and some will be slower.

    The reason for that difference in response times is that manufacturers advertise the fastest speed for a specific transition (they could have cherry-picked the time for the fastest transition out of all the tested transitions, or they could simply be advertising the BtW time), while NBC provides the rise and fall transition times for BtW and for 50% grey to 80% grey.

    For example: you stated that the 5ms response time corresponds to the C2 rev (the transition for that figure isn't provided, but it would seem to correspond to BtW), but in NBC's testing that panel had a 4ms rise for BtW and a 3ms fall for WtB, while the GtG (50% to 80%) values were 12ms/14ms.

    https://www.msi.com/blog/Introduction-to-MSIs-120Hz-5ms-3ms-gaming-panel
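    To put numbers on that, the arithmetic with the figures quoted above (no new measurements, just the sums over different transitions) looks like this:

        # How one panel can be both "5 ms" and "~26 ms": different transitions, different sums.
        # The figures are the ones quoted above from NBC for the Rev. C2 panel.

        btw_rise_ms, wtb_fall_ms = 4.0, 3.0    # black -> white rise, white -> black fall
        gtg_rise_ms, gtg_fall_ms = 12.0, 14.0  # 50% grey -> 80% grey and back

        print("BtW total (closest to the marketing number):", btw_rise_ms + wtb_fall_ms, "ms")
        print("GtG 50%-80% total (what NBC headlines):     ", gtg_rise_ms + gtg_fall_ms, "ms")

    Same panel, very different-sounding numbers, depending on which transition gets quoted.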

     
    Donald@Paladin44 likes this.
  46. Fungus99

    Fungus99 Notebook Consultant

    Reputations:
    4
    Messages:
    265
    Likes Received:
    19
    Trophy Points:
    31
    Is the GPU upgradeable in the GT75 TITAN-057?
    Since the screen is TN, how bad are the viewing angles?
    I've heard the GT63 has bad build quality. Is the GT75 much of an improvement? Does anybody's unit have flex?
    Why isn't there an option for the i9 CPU with GTX 1070 graphics, since the CPU is soldered on and can't be upgraded later?
    Does the CPU throttle at all at max clock speed when running 5-plus hours?
    I originally intended to go for a slim gaming laptop like the Asus Zephyrus M or Aero 15X, but after hearing about the CPU throttling, I'm going to go with a big and bulky system.
     
  47. an0n

    an0n Newbie

    Reputations:
    0
    Messages:
    2
    Likes Received:
    0
    Trophy Points:
    5

    So, do you know what panel I would have to buy to get the reported response times of the GT75 laptops?

    Is it an N173HHE-G32 Rev. C3?
     
  48. raz8020

    raz8020 Notebook Consultant

    Reputations:
    520
    Messages:
    225
    Likes Received:
    303
    Trophy Points:
    76
    I think only MSI would know the answer to that question, but in theory yes, the latest rev should be the one with better response times.
     
  49. Fungus99

    Fungus99 Notebook Consultant

    Reputations:
    4
    Messages:
    265
    Likes Received:
    19
    Trophy Points:
    31
    Would I be silly to even consider buying the single GTX 1070 / i7 model due to budget constraints, when there are other alternatives out there with the exact same specs but in a slim, portable form factor like the Asus Zephyrus M, Aero 15X, or MSI Stealth? Or is this the only 8th gen laptop that actually has enough cooling to adequately run the new 6-core CPU without throttling, and is it worth it for that reason?

    Does the base i7 generate significantly less heat than the i9 CPU?
    For someone who doesn't do any overclocking, would the i7 be overkill for my needs?
    I've heard even the cooling on this system still isn't enough with the i9. Is this true?
     
  50. Falkentyne

    Falkentyne Notebook Prophet

    Reputations:
    8,396
    Messages:
    5,992
    Likes Received:
    8,633
    Trophy Points:
    681
    The GT75 series uses the exact same heatsink; only the CPU (or video card) is different. If the BIOS and EC firmware are the same between the i7 and i9 versions of this model, then the mainboard and heatsink MUST be the same, with only the PSU and GPU differing (the GTX 1080 uses a different heatsink than the GTX 1070, of course).

    The i7 and i9 chips are exactly the same, besides the i7 being more locked down and a lower-quality silicon bin. Heat produced is a function of voltage and current, so i7 and i9 6-core CPUs running at 3.4 GHz @ 1.1V will both produce exactly the same heat, provided of course the default VID is identical for the same speed preset. The difference is that the i9 is fully unlocked while the i7 isn't, so the i9 will potentially be able to clock higher (and thus run hotter as a result).

    The silly and confusing thing about this i7 vs i9 drama is that the i9 + GTX 1080 uses 2x 230W PSUs (the same configuration the old GT73VR 1070 SLI 4-core system used), while the i7 uses a single 330W PSU. And there is no i7 + GTX 1080 configuration (as far as I know), and there is also no i9 + GTX 1070 configuration.

    I think MSI's hardware team is overdosing on marijuana, because they could easily have produced an i9 + GTX 1070 configuration and bundled it with a single 330W power supply.

    Perhaps they thought that customers would consider a 115W GPU useless when paired with an unlocked 6-core CPU, and that the 115W GPU would be best paired with the lower-tier, locked or partially locked chips, especially this far into the GTX 1080/Pascal lifespan. And TDP modding the GPUs on these 6-core systems is currently impossible.

    To be honest, a 115W GPU paired with a fully unlocked 6-core CPU looks sort of stone age, whether it's an 8700K or an 8950HK.
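    The point about heat being set by voltage and clock rather than the badge on the chip follows from the usual first-order dynamic power relation, P ≈ C·V²·f (leakage ignored, identical silicon assumed). A quick sketch with illustrative numbers; the overclocked voltage/frequency pair below is hypothetical, not a measured GT75 result:

        # First-order dynamic CPU power: P ~ C * V^2 * f. Ignores leakage; C is the
        # effective switched capacitance, treated as an arbitrary constant here, so
        # the numbers only mean something relative to each other.

        def relative_power(voltage_v, freq_ghz, c=1.0):
            return c * voltage_v ** 2 * freq_ghz

        baseline  = relative_power(1.10, 3.4)   # i7 or i9 at the same V/f: same heat
        overclock = relative_power(1.25, 4.3)   # hypothetical unlocked-i9 overclock

        print("Same V/f (i7 or i9):", round(baseline, 2), "arbitrary units")
        print("Unlocked and pushed:", round(overclock, 2),
              f"(about {overclock / baseline:.2f}x the heat)")

    In other words, the i9 only runs hotter if you actually use the headroom it unlocks.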
     
    raz8020 likes this.