LPC-Digital has been a Prema partner for many years...
-
Larry@LPC-Digital Company Representative
-
atquantrandash93 Notebook Consultant
@Larry@LPC-Digital : Great! Thanks Larry!
-
Meaker@Sager Company Representative
-
-
atquantrandash93 Notebook Consultant
Hi guys,
How good are the 144Hz and 300Hz panels? I read that they have 9ms and 8ms response times, respectively. How does that affect gameplay and day-to-day usage, e.g. web browsing, watching videos, editing documents, etc.?
-
atquantrandash93 Notebook Consultant
That's true. How do they compare to a 3ms or 1ms response time screen? I'm asking because I've seen neither in person, and if I'm paying over $3,000, I need to know as much as possible before the purchase.
-
atquantrandash93 Notebook Consultant
Nice! Thanks!
-
-
atquantrandash93 Notebook Consultant
Could you share where you bought your X170 and what the specs are? LPC-Digital said the 144Hz and 300Hz panels for the X170KM-G with RTX 30xx have 9ms and 8ms response times, respectively, while yours has only 5ms.
Here are the links from LPC-Digital:
144Hz: https://www.panelook.com/B173HAN04.0__17.3__overview_37192.html
300Hz: https://www.panelook.com/B173HAN05.1_AUO_17.3_LCM_overview_44168.html
Your LG Philips LP173WFG-SPB1: https://www.panelook.com/LP173WFG-SPB1_LG Display_17.3_LCM_overview_44065.html -
Larry@LPC-Digital Company Representative
Please note:
The supply shortage for every component, including screens, is really bad, and parts can change at any time.
We cannot fully confirm the exact screens being used during this severe situation. Sorry. -
atquantrandash93 Notebook Consultant
That's tough. Thanks Larry, for the update!
-
Meaker@Sager Company Representative
Also, the headline response time figure makes it hard to compare panels, due to differences in measurement methods and in grey-to-grey versus full-transition performance.
-
atquantrandash93 Notebook Consultant
I guess there's more to consider than response time alone.
-
Has anyone here tried to OC their i9-10900K in a Clevo laptop with the Prema Custom BIOS, either through the BIOS or Intel Extreme Tuning Utility? I would like to see some results and the settings you used in XTU or the BIOS. I will get my new X170KM-G from HIDevolution in a week, and I wanted to prepare myself for overclocking this beast of a machine as well as I can. Maybe someone else is using the same machine right now (liquid metal on CPU).
-
So my KM-G arrived a few days ago. Purchased from GentechPC. Shipped straight from Sager since I didn't do any modifications outside the norm. Took 5 1/2 business days from time of order to arrive at my doorstep (in the same state), super quick. Unfortunately it arrived w/ some cosmetic defects on part of the chassis, which would require a complete teardown to fix.
Currently in communication w/ Ken @GenTechPC, who's been very helpful, so hopefully Sager makes this a painless process in either getting a new replacement or a refund (so I can order again).
Outside of that, it runs amazingly well after you dial in the settings, etc.
Edit: Btw, for those not going the Prema route, these 3200MHz CL18 1.2V (JEDEC) sticks work great in it. 2x32GB (dual rank).
https://www.amazon.com/gp/aw/d/B08LL2NC1H/ref=yo_ii_img?ie=UTF8&psc=1 -
win32asmguy Moderator
-
Could you please test whether LCD overdrive is active like it is on the SM, or whether Clevo finally disabled it?
Please run this test while the monitor is running with G-Sync enabled and max refresh rate:
https://www.testufo.com/ghosting
Please take a picture and, if you can, a video (at least a 60 fps setting on the camera) of the laptop screen where all 3 lines are visible. Please upload the picture and video somewhere with no quality loss.
Thank you! -
I've been mainly using ThrottleStop and haven't touched the cancerous OC area in CCC; I'm just using it for fan curves and RGB. I'm mainly gaming on it, so I dropped PL1 to 130W and PL2 to 150W in TS, disabled the power limiters, etc. Undervolted -125mV on the core, -40mV on the cache. It maintains 4.8GHz all-core w/ ease, while keeping 5GHz+ on lower core loads. GPU OC @ +170/+1200. Temps in GPU-bound games on auto fans are around 56-60°C GPU / 60°C CPU. Noise has been tolerable; it definitely doesn't reach max fan status.
One thing to note: in GPU-bound games, your CPU will drop to its base all-core clock due to how Dynamic Boost 2.0 works. So w/ mine (10850K) it's 3.6GHz (Shadow of the Tomb Raider), as it's pulling close to 165W on the GPU (pegged @ 99% GPU utilization, 1950-1965MHz). For CPU-bound games like Battlefield 1/V multiplayer, the CPU doesn't drop and maintains my 4.8GHz all-core throughout, w/ the GPU hovering around 150W (only at 65-70% GPU utilization; the game kept hitting the frame limit of 200fps @ max settings). GPU 55°C / CPU 70-78°C (CPU pulling 80-90ish watts) w/ auto fans. On max fans, the CPU wouldn't touch 70°C.
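If you want to watch Dynamic Boost shuffle the wattage around yourself, a quick polling loop over nvidia-smi's query flags is enough. This is just a rough Python sketch on my part (it assumes nvidia-smi is on your PATH; the one-second interval is arbitrary), and HWiNFO shows the same data:
[CODE]
import subprocess
import time

# Poll the GPU once per second and print power draw, graphics clock,
# temperature and utilization. In a GPU-bound game you should see power
# climb toward ~165W while the CPU drops to its base all-core clock.
QUERY = "power.draw,clocks.gr,temperature.gpu,utilization.gpu"

while True:
    out = subprocess.run(
        ["nvidia-smi", f"--query-gpu={QUERY}", "--format=csv,noheader"],
        capture_output=True, text=True, check=True,
    ).stdout.strip()
    print(out)  # e.g. "164.21 W, 1950 MHz, 58, 99 %"
    time.sleep(1)
[/CODE]
Run it in a second window while the game is up if you want a simple log of the power handoff.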
The 165w 3080 is definitely a beast in games, especially w/ RT.
In my experience, OC to OC, the 165w 3080 is about 15% faster than the 200w 2080 Super in rasterization games @ 1080P, around 20% at higher resolutions. In RT games it's around 20% faster, more at higher resolutions.
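Rough perf-per-watt math on those numbers, just to put the 165W cap in perspective (back-of-the-envelope only; the 100 fps baseline below is made up purely to normalize the comparison):
[CODE]
# Normalize the 200W 2080 Super to 100 fps and apply the ~15% uplift quoted above.
fps_2080s, watts_2080s = 100.0, 200.0
fps_3080, watts_3080 = 115.0, 165.0

ratio = (fps_3080 / watts_3080) / (fps_2080s / watts_2080s)
print(f"perf/W ratio: {ratio:.2f}x")  # ~1.39x, i.e. roughly 40% better perf per watt
[/CODE]
So even capped at 165W, the efficiency jump over the 200W Turing part is substantial.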
I planned to do a bunch of YT game videos, but that'll have to wait until I get my replacement.
Here's a Port Royal benchie though, good enough for #1 laptop 3080 score (w/ normal game OC +170/+1200)
https://www.3dmark.com/pr/981602
-
Is there any way to disable that CPU clock drop? That could have negative effects in anything that stresses both the CPU and GPU. -
Edit: BTW, it disables Resizable BAR too. Though obviously that wouldn't matter in stuff outside of games, where both are being hammered. -
win32asmguy Moderator
-
-
yrekabakery Notebook Virtuoso
The boosting (read: throttling) algorithm is not ideal in games that stress both the CPU and GPU at the same time, like Battlefield, CoD/Warzone, Cyberpunk, Assassin's Creed, etc.
-
Warzone is extremely CPU bound as well; because of this, the CPU runs full bore, since GPU utilization is low, to try and push GPU utilization up. It does what it needs to do.
Cyberpunk is GPU bound, at least on my OC/UV settings w/ the 10850K/3080. It's pegging 97-99% GPU utilization at nearly 160-165W w/ max settings/DLSS Quality/RT Psycho, including while driving. Because of this, the CPU downclocks to give more wattage to the GPU.
Assassin's Creed is a terribly optimized game in the same fashion as Far Cry 5, on all systems. You could be at 95% GPU utilization and use a lot less wattage than another game at the same GPU utilization. Ubisoft really needs to fix their inefficient engines.
The 10850K/3080 combo @ 1080P is definitely overkill for most games. You will be CPU bound (@ full speed) before needing that +15W on the GPU. Hooking up a 4K monitor? The CPU will be less of a factor, as you'll finally become GPU bound and actually need those 165W, and you'll see nearly all recent games become GPU bound.
Turning boost off in games won't increase your FPS w/ this setup @ 1080P. Just basing that off my experience. Not to mention it disables Resizable BAR. It's on when it needs to be and off when it's not. The algorithm is surprisingly smart.
Not defending Nvidia implementing Boost 2.0 vs. just straight up making it 165W, but it's some helpful info I thought I'd share.
Edit: Added a couple photos, same map, kept dying trying to screenshot
BFV @ 1080P maxed out, multiplayer (64 player conquest)
BFV @ 1080P maxed out, multiplayer (64 player conquest)
BFV @ 4K maxed out, multiplayer (64 player conquest)
-
-
I also noticed that it is not that hard to run the 10850K at 4.8 GHz on all cores with a bit of help from ThrottleStop. ThrottleStop showed a peak power consumption of up to 165W with the 10850K running on all cores in Time Spy. -
https://www.3dmark.com/spy/16605937
And it still has a long way to go...
@DRevan
Informed Clevo about the issue way back when you first experienced it, but given their track record in listening to feedback, I wouldn't count on it being fixed in the KM (there is at least no such toggle in BIOS or CCC).
Edit: Sorry, don't have a high refresh screen to test this myself. Could someone please help this guy?!
http://forum.notebookreview.com/thr...2m-owners-lounge.835639/page-17#post-11088323 -
So guys, what do you think is the best bang for the buck for the KM? The 10850K with the 3070? It looks like it will be hard to get a 3xxx card into my P775TM, so I was thinking of selling my P775 and getting a KM. I still have time, and I want one with that 165Hz QHD panel.
-
BrightSmith Notebook Evangelist
If you're upgrading from a GTX 1080, the 3070 makes sense. The 3080 won't give you much added value for the price, imo. I wouldn't advise upgrading from a 2080 unless money is not a factor; then I would go for the 3080, of course. If multithreaded CPU power is what you're after, the 10900K will be king even in the Rocket Lake era.
-
Yeah, from the 1080. The price difference from the 3070 to the 3080 is big, too big for me anyway. Yes, I need the threads, so a 10th gen is the way to go, but is the 10900K so much better than the 10850K?
I was thinking of buying it as a barebone, getting it with the 3070, then buying a 10850K from eBay or from a good promotion, and I'll port some of the SSDs from my old P775; I don't need to sell it with 4 SSDs inside, and I'll even take the RAM from the old one. -
Also, there seem to be certain cases where my 10850K goes down to its nominal speed without turbo, so I would NOT go with a non-K CPU, as that may mean 2.8 instead of 3.6 GHz, which is not really that great.
Please note that you will not be able to use 4 of the M.2 form factor SSDs with a 10th gen CPU, as the fourth slot will only work with a Rocket Lake CPU. You probably know that there is no 2.5" slot anymore, but I mention it just in case. -
win32asmguy Moderator
Although honestly any kind of CPU limitation is not necessary for the given GPU boost and goes against the spirit of the machine. I am guessing it just has to happen this way since Nvidia does not want anybody getting ahold of a signed vbios that can run at 165W TGP bypassing the boost system. -
-
Has anyone tried the new 3080 MXM GPU in their X170SM-G yet? Does it work with the Prema mod BIOS, or is new firmware needed to make it work? I am wondering if a GPU-and-heatsink-only swap is possible on the previous-generation X170SM-G.
-
win32asmguy Moderator
For games it does not seem to matter much between the 10900K and 11th gen; either can keep up just fine. I also do like the idea of having the 4th dedicated PCIe slot working (so much so that it seems like the X170SM may be a better choice for pairing with a 10th Gen CPU). -
With everything being so hard to get at the moment, the chances of getting the SM-G at a good price with a new 30xx card and the new heatsink in a nice turnkey package seem rather slim compared to that. -
Something from XMG regarding the low-watt 3080:
Apart from a successor (or refresh) of the XMG APEX 15 with at most an RTX 3070, no other model with an AMD desktop CPU is planned at the moment. There are several reasons for this, including the fact that none of the three big chip vendors (AMD, Intel, NVIDIA) is particularly enthusiastic about supporting such niche products. At Intel, support for the XMG ULTRA platform and its various predecessors (X7200, P570WM, P775TM, etc.) at least has a long tradition, well-rooted networks in Taiwan, and a currently still relatively secure supply of CPUs and chipsets, minimising the risk of all those nice development costs being wasted by a paper launch.
And yes, NVIDIA also plays a role here - after all, every GPU product has to go through the usual Greenlight process, and in the case of a notebook, the overall system (motherboard, CPU, power budget) also plays a role. Other ODMs (I'm not naming any names) have already tried to put systems with desktop CPUs (regardless of whether they are AMD or Intel CPUs) into development and haven't got very far despite already having finished, sensible designs for board layouts.
On this occasion, one can also ask why an RTX 3080 in the XMG ULTRA 17 is limited to 165W TGP, when the RTX 3070 in the desktop is at 220W TGP.
XMG ULTRA with i7-11700K and RTX 3080
Unigine Superposition 4K Stress Test
GPU Power (sustained): 160W
GPU Clock (sustained): 1677MHz
GPU Temp (sustained): 63°C
As you can see in the GPU stress test, the graphics card with the cooling system of the ULTRA 17 still has a lot of thermal headroom: 63°C in the stress test with a maximum GPU temperature target of 87°C allowed by NVIDIA.
The answer is as so often: the manufacturers optimise for mass, and the masses in the laptop sector want thin & light, so there is simply no board layout from NVIDIA for laptops in this generation that supports 200W. Whether this will change again in the future remains to be seen.
Unfortunately, this is all much, much more complicated than one can (or would like to) imagine as an end customer, and unfortunately we can't talk about it very openly without breaking various NDAs.
Cheers,
Tom -
Meaker@Sager Company Representative
-
I think it's a hard choice. On one hand, buying Skylake in 2021 seems utterly ridiculous - it is 6 years old. The latest mobile phone cores are creeping up on it. On the other hand, it seems something was lost in the backporting to 14nm, and Rocket Lake just isn't that impressive. Given how disappointing mobile Ampere is, I think it's better to just skip this gen altogether unless you're upgrading from something older than Coffee/Comet Lake and Turing. -
I hope that a year from now I will be proven wrong, but at the moment things do not look too good, with Nvidia preventing the X170 from making use of its superior cooling capabilities, and who knows how good Alder Lake will really be when it gets released. The extremely underwhelming Rocket Lake launch for now looks more like an uncontrolled mid-air explosion...
-
Thanks for trying to find worthwhile improvements despite Nvidia effectively sabotaging more capable and better designed laptops with lower TDPs and Dynamic Boost 2.0. -
-
win32asmguy likes this.
-
win32asmguy Moderator
-
hacktrix2006 Hold My Vodka, I going to kill my GPU
@MD9787: Has anybody located the BIOS chips already? I tried to find them. Maybe I'm just blind.
Give me a mainboard image of both sides and I will be able to find it; they must be HQ images though, so I can really zoom in. -
DaMafiaGamer Switching laptops forever!
@Kp86 Just wanted to ask if your RTX 3080 has the same hardware ID as mine (you can check in Device Manager).
Mine is as follows:
PCI\VEN_10DE&DEV_249C&SUBSYS_16021043&REV_A1
Does your RTX 3080 also have 249C as the identifier? If so, I could potentially use that 165W vBIOS.
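If it helps, here's a quick way to dump that hardware ID without clicking through Device Manager. Just a sketch that shells out to wmic from Python (wmic is deprecated on newer Windows builds, so if it's missing, Device Manager works as described above):
[CODE]
import subprocess

# List the name and PNP device ID of every display adapter; the RTX 3080 entry
# should contain the PCI\VEN_10DE&DEV_xxxx string from the post above.
out = subprocess.run(
    ["wmic", "path", "win32_videocontroller", "get", "name,pnpdeviceid"],
    capture_output=True, text=True,
).stdout
print(out)
[/CODE]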