There is a MASSIVE difference in speed between a SATA SSD and an NVMe drive. You might wanna check out why that is and then decide from there.
-
Shyt I play IS CPU and GPU intensive so, we'll see how mine compares. I'm curious about your screenshots n why would u max ya fans if ya shyt isn't running hot to begin with? U like the extra fan noise? If my CPU idles at 50 and isn't taxing any of the games I push, fans would be on low...
I'll be happy with a 4.8ghz @ 60ish c's and runs around 80c with max fans on playing a game like Destiny 2 or The Witcher. U got dem "damn near 2 good to b true" temps right now.. I'm especially curious about those screenshots now...
-
This is after about a half hour of Overwatch.
-
-
Who the hell would run fans on high when the machine is idle? I run fans on high during gaming sessions. I use headphones for sound so I couldn't care less what the machine sounds like when I'm gaming. The rest of the time the fans are on auto.
-
@Ogg and this is without VC? I wonder what it will look like when it finally arrives. Keep us updated
-
Got the following installed from the OEM:
Silicon Lottery Binned,
Delidded and Tested,
Factory Overclocked 8th Generation Intel Core i7-8700K 6 Core-12 Thread Processor,
4.7 GHz (HIDevolution Overclocked to 4.9GHz, rated to 5.1GHz) - GUARANTEED Performance (Thermal Grizzly Conductonaut) -
-
Thanks @Prema for the suggestions!
3DM11P - https://www.3dmark.com/3dm11/12537394
3DM11X - https://www.3dmark.com/3dm11/12537384
*Spent the rest of the day figuring out how to make Catzilla work. It turns out that installing the Z170 chipset driver from my P870DM2 and transferring it to the P870TM did the trick.
Catzilla 576p
Catzilla 720p
Catzilla 1080p
Catzilla 1440p
Catzilla 4K is grayed out and I'm unable to test.
@D2 Ultima @Papusan , 4.5Ghz is doable after all. Temps are between 77-79C (fans on OC profile, 1.14V). Sorry I panicked a little early. -
Bout to hit them up and up my chip...
-
So, after some new information, I'm at a crossroads.
My CPU is paid for but hasn't been built yet. This will mainly be for gaming, both on the laptop monitor and out to a 34" 3K widescreen monitor @ 120Hz G-Sync. Here is my issue... I want the performance, but I'm trying to maintain decent thermals. I also want the upgradability. (Provided the next chipset is viable and compatible with the Coffee Lake MB.)
Of the two options below, which would you choose and why? They both would be around the same price, as I would add additional options like a longer warranty, RAM and RAID M.2 SSDs to option "A".
A - Silicon Lottery Binned, Delidded and Tested, Factory Overclocked 8th Generation Intel Core i7-8700K 6 Core-12 Thread Processor, 4.7 GHz (Overclocked to 4.9GHz, rated to 5.2GHz) & NVIDIA GeForce GTX 1080 8GB GDDR5X 200W + Vapor Chamber GPU & CPU Heat Sink Set - NVIDIA G-Sync Enabled
OR
B - Delidded - Unlocked, Undervolted and Overclocked 8th Generation Intel Core i7-8700K 6 Core-12 Thread Processor, 4.7 GHz (Overclocked to 4.8GHz) & DUAL NVIDIA GeForce GTX 1080 8GB GDDR5X 200W + Vapor Chamber GPU & CPU Heat Sink Set - NVIDIA G-Sync Enabled -
-
Definitely option B.
I personally enjoy the 2 1080's over just one. Sure SLI isn't supported by every game, but it's still more horsepower for the ones that do support it, and just in general. Plus, I think you'll be battling cpu temps with anything over 5ghz, just my opinion. Yeah, that particular chip may be silicon lottery, but the gains from 4.8ghz to 5.0ghz for gaming are going to be rather minimal. Not worth it to me considering once again, I feel like you'd be battling temps at that point. Plus OC'ing is something you can save your money on and do it yourself man.
Ask, but they may also let you buy the warranty after the purchase too...if that may be more feasible for you later on. It's worth asking anyway.
Like I was telling you yesterday, you got the same build as me. I'd stick with that. -
Forget the silicon lottery one, take two cards if they come out to the same price. -
Thx fellas
-
Agree with others, go SLI. IMO not worth all that cash for a .2 higher overclock that will keep ur processor cooking. Much rather have another GPU
-
@Papusan
@Mr. Fox
@Meaker@Sager
@D2 Ultima
and anyone else with knowledge on this,
Correct me if I'm wrong, but I think I have finally done enough research to understand the pcie lanes stuff. Would just like some extra eyes to make sure I understand this right before I order my ssd to put in this laptop.
The CPU provides the GPUs with 16 lanes and the mobo chipset provides 24 lanes, but only 8 lanes usable (since something on Intel's site said that the mobo would only have as many lanes as the CPU lanes; that part still confuses me).
So, the 16 lanes coming off my 8700k will go to my SLI cards, both cards at x8 performance.
Then, adding my M2 pcie ssd is going to take up 4 lanes, leaving me with 4 lanes left over, for another M2 pcie ssd or whatever else, in the future. However I probably wouldn't add another M2 pcie ssd beyond 1, because of possibly being bottlenecked by the DMI link between the chipset and the cpu since it only uses 4 lanes for that link.
Did I get that right fellas?
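Just to sanity-check the arithmetic above, here's a quick sketch (Python; the lane counts are the ones quoted in the post, treat it as a toy model rather than an Intel datasheet):

```python
# Back-of-envelope lane budget as described above (toy model, not a spec).
cpu_lanes = 16              # PCIe 3.0 lanes directly off the 8700K
chipset_lanes = 24          # lanes hanging off the chipset (PCH)

# SLI splits the CPU lanes between the two cards:
gpu1 = cpu_lanes // 2       # 8 lanes to card 1
gpu2 = cpu_lanes // 2       # 8 lanes to card 2

# An M.2 PCIe SSD sits on the chipset and uses 4 of its 24 lanes,
# so it never touches the GPU lanes at all:
remaining_chipset = chipset_lanes - 4

print(gpu1, gpu2, remaining_chipset)  # 8 8 20
```

The key takeaway is that the SSD comes out of the chipset budget, not the CPU's 16 GPU lanes.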
Also, is there a motherboard manual for the motherboard in the P870TM1 (beyond just the laptop manual I received from HID)? I'd like to know which, if any, SATA ports would get turned off if I did go this route and add an M2 PCIe SSD. -
Reason being, Volta mobile GPUs might just be around the corner, and there's no point paying top dollar for something on an almost 2-year-old architecture. The Volta 12nm Titan with HBM2 just came out IIRC, so there's something that will soon follow it. On the other hand, no doubt option B will get you a CPU, but you don't know if it'll be on the lowest end of the binned side; 4.8GHz might really be all you can get and no more, which would mean high temps and high voltage. Going for A ensures you a much better CPU while leaving you room to upgrade the GPU (assuming Clevo doesn't screw around again). -
Meaker@Sager Company Representative
The DMI 3.0 link between the CPU and chipset is effectively a PCI-E x4 link. Any link you see from the CPU is independent; everything that is connected to the chipset has to come back across the DMI 3.0 bus to the CPU. -
-
Meaker@Sager Company Representative
24 lanes, but at any point their combined bandwidth can't exceed the equivalent of 4 lanes. As I said, all devices on the chipset have to talk to the CPU, so they are limited by the DMI bus.
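For anyone curious what that x4-equivalent link actually works out to, here's a rough calculation assuming standard PCIe 3.0 signaling (a sketch of the theoretical number, not a measured figure):

```python
# DMI 3.0 is electrically equivalent to a PCIe 3.0 x4 link.
# PCIe 3.0 runs at 8 GT/s per lane with 128b/130b encoding.
transfer_rate = 8.0          # GT/s per lane
lanes = 4
encoding = 128 / 130         # payload fraction after 128b/130b encoding

payload_gbits = transfer_rate * lanes * encoding   # gigabits/s of payload
dmi_GBps = payload_gbits / 8                       # gigabytes/s

print(round(dmi_GBps, 2))  # 3.94 -- shared by everything behind the chipset
```

So roughly 3.94 GB/s is the ceiling for the chipset side, no matter how many of its 24 lanes are populated.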
-
-
Meaker@Sager Company Representative
Also, you need a pretty specific workload before x4 bandwidth becomes a limit for you.
-
-
Meaker@Sager Company Representative
Adding in switches costs money and a full x16 connection offers no benefit in single card mode for that cost.
-
-
Single GPU though, doesn't even matter if it's a 4x lane width on PCI/e 3.0, it's enough. -
Donald@Paladin44 Retired
However it won't have G-Sync. According to @Prema G-Sync will be stopped at the driver level.
Other than @SirSaltsAlot, is anyone else interested? -
-
-
If you check the final section in my SLI guide there's a few links to tests with Witcher 3/Rainbow 6 Siege/etc and even some other games. It has been an issue since Maxwell, and every SLI-capable Pascal generation card suffers from it.
Benchmarks have no issue because they are designed to NOT stress inter-GPU bandwidth with such tech. Same with games that are designed to be AFR-friendly, like Sniper Elite 4 (which is what all games should strive to do, honestly). You wouldn't even notice that you'd have a problem unless you had the ability to compare lane for lane. So someone with say, a 5960X would be able to test by using PCI/e 3.0 x16/x16 and PCI/e 2.0 x16/x16 (since 2.0 is roughly half the bandwidth of 3.0, it emulates a PCI/e 3.0 x8/x8 connection), and then test the different games.
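The PCIe 2.0 x16 vs 3.0 x8 equivalence mentioned above checks out numerically. Back-of-envelope, using the standard per-generation signaling rates and encodings:

```python
# PCIe 2.0: 5 GT/s per lane with 8b/10b encoding  -> 0.5 GB/s per lane
# PCIe 3.0: 8 GT/s per lane with 128b/130b        -> ~0.985 GB/s per lane
gen2_lane_GBps = 5 * (8 / 10) / 8
gen3_lane_GBps = 8 * (128 / 130) / 8

gen2_x16 = gen2_lane_GBps * 16   # what the 5960X test rig runs at
gen3_x8 = gen3_lane_GBps * 8     # what it emulates

print(round(gen2_x16, 2), round(gen3_x8, 2))  # 8.0 7.88 -- close enough
```

The ~1.5% difference comes from 3.0's more efficient 128b/130b encoding, so the emulation slightly favors the 2.0 setup.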
The need for bandwidth is so bad that some games even exhibit negative scaling (especially at higher resolutions) with the flex bridge. The important thing to note is that GPU utilization is not an indicator of positive or negative scaling with complete certainty. Sometimes a bad SLI profile will not let your utilization % cross about 50% for anything but short bursts and that's an indicator of negative scaling, but other times you will be getting a solid 90%+ utilization on each card and simply not be getting proper scaling % (scaling % being the % increase over single card, so 60fps on one card and 78fps on two cards is low scaling %, whereas 110fps on two cards is much better and more desirable scaling %).
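To make the scaling % idea concrete, here's a tiny helper (hypothetical function, just illustrating the definition used above) applied to the numbers from the post:

```python
def scaling_pct(single_fps, sli_fps):
    """Percent gain of two cards over one card -- the 'scaling %' above."""
    return (sli_fps / single_fps - 1) * 100

# The examples from the post: 60fps on one card vs 78 or 110 on two.
print(round(scaling_pct(60, 78), 1))   # 30.0 -> low scaling
print(round(scaling_pct(60, 110), 1))  # 83.3 -> much healthier
```

Note the point made above: both of those results can occur at 90%+ GPU utilization, so utilization alone doesn't tell you which one you're getting.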
For my favourite example, non-HB bridge on x8/x8 grants negative scaling in Fallout 4 at 4K:
I can also attest that I got very low scaling when I tried Fallout 4 on my system, consistent with the HB bridge on x8/x8 here. And then you add more PCI/e lane bandwidth and...
-
-
-
Oh yeah, one more thing @Mr. Fox, a piece of info you'd love. Apparently when the HB bridge came out, it added even more bandwidth to multi-GPU than it does now. In a driver series somewhere around 372.54 (or one or two before it), they forced the excess bandwidth over the LED bridge into frame pacing (making mGPU smoother). This effectively hurt scaling and made x16/x16 more important than it was on release. Welcome to Nvidia.
And since the Titan Volta launched, with NVLink connectors, people reached out asking about NVLink on bridge-only format for SLI. They were told that SLI doesn't work on NVLink and they didn't even turn on the ability to use multi-GPU rendering via NVLink... so basically, it's even worse. Not that they're "not giving us" NVLink in bridge-form for consumers, but rather it's being funneled completely away from consumers, where it would fix the entire bandwidth issue available right now and even re-allow 3-way and 4-way SLI to function better than ever before. -
I'll probably go with a 1TB NVMe SSD then, now that I know it won't affect my GPU lanes and bandwidth and that the DMI link can handle at least one NVMe. Doubt I'll ever install more than one though, knowing about that DMI bottleneck now. Thanks for the clarification man, I appreciate it. -
I'll probably only ever do one though, since I finally understand the DMI link bandwidth limit and I don't want to limit other items coming across that link. I'm OCD with that stuff lol, I try to be as efficient as possible haha. I'm weird, I know. Anyway, thanks guys. Extremely appreciated. -
Meaker@Sager Company Representative
You don't ever max the storage anyway, so even if you got 3x NVMe and had 3.5GB/sec read and write, your other devices would not suffer.
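Putting rough numbers on it (hypothetical figures: ~3.5 GB/s peak per drive, DMI 3.0 at roughly 3.94 GB/s usable), the link is only the ceiling if everything bursts at once, which real desktop workloads basically never do:

```python
dmi_cap_GBps = 3.94        # approx. usable DMI 3.0 bandwidth (PCIe 3.0 x4)
per_drive_GBps = 3.5       # hypothetical peak sequential read per NVMe drive
drives = 3

# Worst case: all three drives burst simultaneously...
theoretical_demand = per_drive_GBps * drives
# ...but everything behind the chipset shares the one DMI link:
effective_ceiling = min(theoretical_demand, dmi_cap_GBps)

print(theoretical_demand, effective_ceiling)  # 10.5 3.94
```

In other words, even a single drive at full burst nearly saturates the link, so extra drives only matter for sustained sequential transfers, not everyday use.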
-
Meaker@Sager Company Representative
I think der8auer got 26GB/sec with his testing on 8x drives in the beta firmware.
-
-
Meaker@Sager Company Representative
But why would a consumer want to go to the levels of bandwidth you are talking about?
Raid will always add a latency penalty too. -
-
FredSRichardson Notebook Groundsloth
Forgive me if this is a naive question. Will this laptop run Linux? This is mainly for numerical processing on the GPUs (via CUDA) and CPU cores. I have heard there is a performance difference for Linux vs Windoze with CUDA. Any insight for this particular model would be great.
-
Though I haven't tried it myself, there is absolutely no reason the P870TM wouldn't run Linux. As a HUGE bonus, there is no garbage iGPU to get in the way!!
-
Meaker@Sager Company Representative
You can turn off secure boot so yes.
-
Picking up my TM1 from FedEx tonight. I'm glad it came 2 days early since I leave the country in a few days. How does the single GPU vapor chamber differ from the dual GPU vapor chamber? Or are they the same part minus the second GPU?
-
Falkentyne Notebook Prophet
Didn't the single card version have two fans, one for the GPU heatpipes and the second for the MXM VRM heatpipes, then the third fan for the CPU itself? While the SLI version has the same two fans, but has a block type of assembly that goes over both GPU's (the heatpipes are barely visible from top on that one)?
-
-
-
*** Official Sager NP9877 / Clevo P870TM-G Owner's Lounge! - Phoenix 4 ***
Discussion in 'Sager/Clevo Reviews & Owners' Lounges' started by Spartan@HIDevolution, Oct 5, 2017.