Thanks, now I don't mind if the coolers work hard and noisy.
Should I rather look at a 1080 with a 7th-gen processor, or a 1070 with an 8th-gen?
...
Most of the time I will run an Android emulator on the PC with one game in it. In my free time, like some weekends, I want to play BF4 again!
-
This ain't looking too hard
I had worse cases :F
-
Falkentyne Notebook Prophet
HE FREAKING DISASSEMBLED THE FIRST HALF OF THE LAPTOP WITH THE BATTERY CABLE CONNECTED!! What in God's name was he thinking? -
-
There are many videos out there where the power cable is connected, and some people even bump the power button so it turns on while the laptop is spread wide open.
And some will still argue that that isn't a problem, while at the same time complaining that their laptop doesn't power up any longer.
It's amazing how the blithe ignorance of newbies can carry them through without even realizing just how lucky they've been to successfully complete a tear down and get it working again.
That's why I so fervently recommend not opening and re-pasting; I've seen too many ruin perfectly good laptops for no real reason of performance advantage. Most of the time it's for posing, vanity reasons.
Why take the chance? Be happy with what you have and tune it with software; you'll have more fun gaming anyway. -
I used this YouTube channel to check the disassembly of my GT72VR... he also has one for the GT75.
-
Hey guys, I just bought my GT75 Titan 8RG and I can't see the integrated GPU anywhere. Is that normal? Is it disabled in the BIOS or something?
-
I don't really need it, it's just unexpected. I thought all GT laptops switched between GPUs.
Why did they discontinue that option? -
Pretty useless; I haven't even switched mine once in three months on the GT73VR. I don't need that small improvement in battery time.
-
How do I flash the vBIOS?
What will happen if I disable the GTX 1080 in Device Manager? -
Ok, it's just going to one monitor in low resolution. Sucks, but ok, whatever.
-
Kevin@GenTechPC Company Representative
It's better this way without Optimus, as that one extra hop in the circuitry can cause micro-stutter in some cases.
-
Also: it doesn't work with the 980M in my GT60 anyway. -
I don't understand why it causes stutters these days. I had Optimus running on my old ASUS with an NVIDIA GT 635M and never noticed any problems with it at all... then again, that was on extremely old drivers (340 or something along those lines) and Windows 10 versions below the Creators Update. Sad that it does not work on the newer laptops as expected. -
Spartan@HIDevolution Company Representative
Do I miss this being removed from my GT75 Titan? Not one bit. I never used it personally, as I am always plugged in, but I understand some people may want to use it to squeeze more battery life out of their laptops. -
Kevin@GenTechPC Company Representative
-
I just found out that this massive gaming laptop has a soldered CPU...
Why?? I really didn't expect that, because even my GT60 has PGA...
Well... I couldn't find an LGA Clevo in the country I currently live in anyway...
Maybe I should return the laptop and just go for a desktop...
Is it really that bad? If there's no way to upgrade, what do I do after 3 years? Just trash it? Smh.. -
So? Your 8750H is about 50% faster than the 3630QM; that's not too much development in 6 years. Other factors do prompt an upgrade, faster interfaces and graphics for instance. I'm still pleased with my 3630QM's performance on my backup laptop. If you want desktop processors, get a Sager/Clevo clone.
-
6 cores at 45 W TDP instead of 4?
I don't think we would have had a 6-core mobile CPU on 22 nm for laptops, and if we had, I think those laptops would have been much thicker than the GTxx-series ones. -
Kevin@GenTechPC Company Representative
There's the MSI 16L13 to go with if you must have LGA; see below for a review. You can also check with Eurocom for the Coffee Lake offering, specifically the F7 model.
http://www.notebookreview.com/notebookreview/eurocom-tornado-f5-msi-16l13-review -
-
Kevin@GenTechPC Company Representative
-
OK, I see what is happening here...
Another question:
An external monitor is connected via HDMI. The laptop turns the displays off after some time of inactivity; when I move the mouse or something, only the laptop's display turns back on, not the external one. What's up with that, and how do I fix it? -
Maybe it's the monitor's problem. HDMI 1 works OK (but it's 4K at 30 Hz); HDMI 2 doesn't turn the display on in this case, even though the power button light changes. -
Hard to say without knowing which kind of external display you use...
In this case I have to guess, and would say your HDMI 2 port could be an MHL HDMI port; that kind of port is normally for connecting mobile phones and tablets to your display. -
HDMI 1 is low speed and only does 4K at 30 Hz. It switches the display on perfectly, but I can't use 30 Hz.
HDMI 2 is good speed.
It switches the display on perfectly with my old laptop, which sends a 1080p signal.
But it doesn't switch on with this laptop, which sends 4K at 60 Hz. I have to power the monitor off and on manually or reinsert the HDMI cable.
Is it possible to send a 1080p signal from this laptop? Because if I change the resolution, it just scales it, but still sends 4K 60 Hz anyway. -
So, I set 1440p in the NVIDIA control panel and now it switches on without problems.
I will try 4K through DisplayPort when my miniDP-to-DP cable arrives. -
Does anybody have a 4K external monitor? I'd like to make sure whether it's a laptop or a monitor problem. It looks like the monitor. (I don't think it's the cable.)
-
Which HDMI version does the cable support?
2.1, 2.0x, or 1.4 or lower?
If it's 1.4 or lower, you have no 4K 60 Hz support.
For more information you can use http://www.giyf.com -
With lower resolutions like 1440p at 60 Hz or 4K at 30 Hz, it does.
I just wonder if somebody else has experienced that or can test it. -
DisplayPort works OK (just tested it now).
Now I need a new HDMI cable to check if it's a cable problem, but I can't really imagine how the cable would cause that. -
Kevin@GenTechPC Company Representative
-
How did you get all three 960 Pros to work in RAID 0 when only two slots support NVMe and the other is just SATA? Wouldn't the speeds be affected? -
The new GT75 has 2 combo slots and one pure NVMe slot.
Spartan@HIDevolution Company Representative
-
I finally got my GT75 Titan 8RG in (from HIDevolution)! The original had a bad pixel, so I had to send it back, lol. Running beautifully now. Just one question, if anyone can answer it:
Running benchmarks/games, my Kill-A-Watt shows a max of about 275 W (at stock clocks). Is it possible to buy a single 330 W power brick for traveling while my dual 230 W setup stays at home? Is it even a good idea to do that, or is 275 W too close to the 330 W output of the supply?
If it's possible and not inadvisable, where would I go about ordering such a thing? Is the power supply for the GT73 the same (if so, HIDevolution has those on their site)? -
Falkentyne Notebook Prophet
You *can*, but I can't help you with this. I have no idea how the EC works or how it behaves with power supplies of a lower capacity.
The Eurocom 780W PSU has been tested on your model without the EC throttling the CPU under heavy GPU+CPU load, so that has been proven to work. The single 330W PSU has not been tested, so it would be "spend your money and try it out", as I am no longer able to help users with your systems since I don't have access to one. I know how the older MSI laptops worked: you could change "EC RAM" registers in RW Everything and use higher-wattage power supplies to bypass EC-based power restrictions. For example: TDP modding a GTX 1070 to 200W to match the GTX 1080 TDP, buying a 330W PSU, and then changing a value in the EC via RW Everything to set the power ID to the GTX 1080's power ID (failure to do this would cause extreme CPU power throttling once 230W was exceeded, as well as extreme battery leech drain). But again, NO ONE has tested what will happen with a GT75 Titan and a single 330W.
If you do decide to experiment for the benefit of the rest of us, please run ThrottleStop while gaming on the 330W, make sure your combined system power load never exceeds 370W from the wall (going past this will trip the safety circuit of the PSU; PSUs are always capable of drawing more total power than their rating), and in ThrottleStop pay very close attention to the "Limit" checkbox and check for CPU "Power Limit" throttling, e.g. the PL1/PL2 TDP dropping to 45W.
On the GT73VR SLI system (2x 230W + 2x GTX 1070), unplugging one of the two power supplies and using only one brick causes extreme CPU throttling, as if the system somehow detects that only one PSU is plugged in and then throttles at 230W of total AC power. The 2x 230W setup is rated for 460W of total power, yet it manages to detect that only one brick is active; how, I don't know. You would think that having one 230W PSU plugged into the dongle would simply overdraw it (past 230W) and cause it to power down, but instead the CPU just throttles ( @sirgeorge tested that). That's because the "master" power ID value for a GTX 1070 system is "90" whether it's single or SLI (which means 230W); something causes this 230W to be doubled when both power bricks are in use.
For example, using a 230W PSU, a TDP-modded GTX 1070 (modded to 200W), and changing the master power ID to 330W (ID=91) on a GT73VR allows the system to try to draw 330W. The 230W PSU will then be shut off by the system past 250W (>280W from the wall).
Sorry if this confuses you. tl;dr: try it, and check ThrottleStop's "Limit Reasons" for bizarre CPU throttling if you do. -
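To put rough numbers on the single-brick question above, here's a minimal sketch. The 88% adapter efficiency is an assumption for illustration only, not a measured figure for these bricks:

```python
# Rough headroom check for running the laptop on a single 330 W brick.
# The 275 W Kill-A-Watt reading is taken AT THE WALL (AC side); the
# brick's 330 W rating is on the DC side, so we back out an assumed
# adapter efficiency. The 0.88 figure is a guess, not a spec.

def dc_load_watts(wall_watts: float, efficiency: float = 0.88) -> float:
    """Estimate DC-side load from an AC wall-meter reading."""
    return wall_watts * efficiency

def headroom_watts(wall_watts: float, brick_rating: float,
                   efficiency: float = 0.88) -> float:
    """Remaining DC capacity on the brick (negative = overdrawn)."""
    return brick_rating - dc_load_watts(wall_watts, efficiency)

if __name__ == "__main__":
    wall = 275.0  # measured with a Kill-A-Watt at stock clocks
    print(f"Estimated DC load: {dc_load_watts(wall):.0f} W")
    print(f"Headroom on a 330 W brick: {headroom_watts(wall, 330.0):.0f} W")
```

Under that assumption the brick would have some tens of watts of headroom at stock clocks, but overclocking or a hotter workload could eat it quickly, which is why testing with ThrottleStop open is the safer route.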
Thanks for the info. I'll see how annoying it is to use the dual supply in the car tomorrow and consider whether I want to make the purchase at that time. I'll let you all know how it works if I decide to pick one up.
-
Hello all, I have several questions. I am looking to buy an N173HHE-G32 panel as a replacement for a different laptop. I need a screen with the lowest response times possible. Notebookcheck has reviewed many laptops with this panel, and nearly all of them had 25 ms+ GtG / 10 ms+ BtW response times.
However, the only laptops with this panel that have a decent response time (less than 10 ms GtG / 6 ms BtW) are the GT75 laptops.
I have learned that there are variations in the revisions of the panels, and that the Rev. C2 is a "5ms" panel, and that Rev. C3 is a "3ms" panel (according to MSI, as stated by an MSI employee). Knowing this, I thought that the difference was that the GT75 had C3s and the rest had older C2 panels.
HOWEVER, I found an outlier. The MSI GE73 8RF Raider RGB has a 3ms (as stated by MSI) N173HHE-G32 panel.
According to Notebookcheck, that laptop's screen has a GtG response time of 28 ms and a BtW response time of 13.4 ms, similar to all the rest of the N173HHE panels. All the other stats (color, etc.) are nearly identical between the Titan panels and this one.
I wish to get a panel with response times less than 10ms.
My questions are:
Do the GT75 models come shipped with a Rev. C3 panel?
And, do the GT75 panels actually have a 10ms GTG response time as reported by notebookcheck?
Why would the panel in the GE73 8RF have a response time twice as high if it uses the same "3ms" N173HHE-G32 panel?
Thank you for reading, I hope someone with some technical knowledge can chime in. -
Yes, there are multiple revisions of the N173HHE-G32 panel, and there are differences in specs and response times between revisions. There are also slight differences (regardless of the panel brand or manufacturer) in specs even for the same panel with the same revision.
Depending on the shade, some transitions will be faster and some will be slower.
The reason for the difference in response times is that manufacturers advertise the fastest speed for a specific transition (they may have cherry-picked the fastest transition of all the tested ones, or advertised the BtW time), while NBC provides the rise and fall times both for BtW and for 50% grey to 80% grey.
For example: you stated that the 5 ms response time (the transition behind that figure isn't specified, but it would seem to correspond to BtW) belongs to the C2 revision, but in NBC's review that panel had a 4 ms rise for BtW and a 3 ms fall for WtB, while the GtG (50% to 80%) values were 12 ms / 14 ms.
https://www.msi.com/blog/Introduction-to-MSIs-120Hz-5ms-3ms-gaming-panel
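To make the advertised-vs-measured gap concrete, here's a toy calculation. The transition times below are illustrative placeholders in the spirit of the NBC figures quoted above, not specs for any real panel revision:

```python
# Why a "3 ms" or "5 ms" panel can measure ~26 ms GtG in a review:
# the spec sheet quotes the single fastest transition (often part of
# black-to-white), while reviewers report rise + fall for specific
# transitions. All numbers here are made up for illustration.

transitions_ms = {
    "black->white rise": 4.0,
    "white->black fall": 3.0,
    "50%->80% grey rise": 12.0,
    "80%->50% grey fall": 14.0,
}

advertised = min(transitions_ms.values())  # cherry-picked best case
btw_total = (transitions_ms["black->white rise"]
             + transitions_ms["white->black fall"])
gtg_total = (transitions_ms["50%->80% grey rise"]
             + transitions_ms["80%->50% grey fall"])

print(f"Advertised ('up to'):        {advertised} ms")
print(f"Reviewed BtW (rise + fall):  {btw_total} ms")
print(f"Reviewed GtG (rise + fall):  {gtg_total} ms")
```

The same physical panel can therefore honestly carry a "3 ms" sticker and a 26 ms review figure at once; they simply measure different things.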
-
Is the GPU upgradeable in the GT75 TITAN-057?
Since the screen is TN, how bad are the viewing angles?
I've heard the GT63 has bad build quality. Is the GT75 much of an improvement? Does anybody notice flex?
Why isn't there an option for the i9 CPU with GTX 1070 graphics, since the CPU is soldered on and can't be upgraded later?
Does the CPU throttle at all at max clock speed when running 5-plus hours?
I originally intended to go for a slim gaming laptop like the ASUS Zephyrus M or Aero 15X, but after hearing about the CPU throttling, I'm going to go with a big and bulky system. -
So, do you know which panel I would have to buy to get the response times reported for the GT75 laptops?
Is it an N173HHE-G32 Rev. C3? -
I think only MSI would know the answer to that question, but in theory, yes, the latest revision should be the one with the better response times.
-
Would I be silly to even consider buying the single GTX 1070 + i7 model due to budget restraints when there are other alternatives out there with the exact same specs in a slim, portable form factor, like the ASUS Zephyrus M, Aero 15X, or MSI Stealth? Or is this the only 8th-gen laptop that actually has enough cooling to adequately run the new 6-core CPU without throttling, and worth it for that reason?
Does the base i7 generate significantly less heat than the i9 CPU?
For someone who doesn't do any overclocking, would the i7 be overkill for my needs?
I've heard even the cooling on this system still isn't enough with the i9. Is this true? -
Falkentyne Notebook Prophet
The i7 and i9 chips are exactly the same, besides the i7 being more locked down and a lower-quality silicon bin tier. Heat produced is a function of voltage and amps, so an i7 and an i9 6-core CPU both running at 3.4 GHz @ 1.1 V will produce exactly the same heat, if of course the default VID is identical for the same speed preset. The difference is that the i9 is fully unlocked while the i7 isn't, so the i9 can potentially clock higher (and thus run hotter as a result).
The silly and confusing thing about this i7-vs-i9 drama is that the i9 + GTX 1080 uses 2x 230W PSUs (the same configuration the old GT73VR 1070 SLI 4-core system used), while the i7 uses a single 330W PSU. And there is no i7 + GTX 1080 configuration (as far as I know), and no i9 + GTX 1070 configuration either.
I think MSI's hardware team is overdosing on marijuana, because they could easily have produced an i9 + GTX 1070 configuration and bundled it with a single 330W power supply.
Perhaps they thought customers would consider a 115W GPU useless when paired with an unlocked 6-core CPU, and that the 115W GPU is best paired with the lower-tier locked or partially locked chips, especially this far into the GTX 1080/Pascal lifespan. And TDP modding the GPUs on these 6-core systems is currently impossible.
To be honest, a 115W GPU paired with a fully unlocked 6-core CPU looks sort of stone age, whether it's an 8700K or an 8950HK.
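The point that heat depends on voltage and clocks rather than the i7/i9 label can be sketched with the usual dynamic-power approximation P ≈ C·V²·f; the capacitance constant below is made up purely for illustration:

```python
# Dynamic CPU power approximation: P ≈ C * V^2 * f.
# C (effective switched capacitance) is a made-up illustrative constant;
# real values vary per chip, workload, and process.

def dynamic_power(c_farads: float, volts: float, freq_hz: float) -> float:
    """Approximate dynamic switching power in watts."""
    return c_farads * volts**2 * freq_hz

C = 1.0e-8  # hypothetical effective capacitance

# Same voltage and frequency -> same heat, regardless of the label:
p_i7 = dynamic_power(C, 1.10, 3.4e9)
p_i9 = dynamic_power(C, 1.10, 3.4e9)

# An unlocked chip pushed to higher clocks (needing more voltage)
# runs hotter, and the V^2 term makes the voltage part dominate:
p_i9_oc = dynamic_power(C, 1.25, 4.5e9)

print(f"i7 @ 3.4 GHz, 1.10 V: {p_i7:.1f} W")
print(f"i9 @ 3.4 GHz, 1.10 V: {p_i9:.1f} W")
print(f"i9 @ 4.5 GHz, 1.25 V: {p_i9_oc:.1f} W")
```

The squared voltage term is why the unlocked chip only runs hotter when you actually use the extra headroom; at identical settings the bin label changes nothing thermally.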
*** The Official MSI GT75 Owners and Discussions Lounge ***
Discussion in 'MSI Reviews & Owners' Lounges' started by Spartan@HIDevolution, Jun 23, 2017.