The Notebook Review forums were hosted by TechTarget, who shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information posted on the forums was preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.

    *** Official Clevo X170KM-G/Sager NP9672M Owner's Lounge ***

    Discussion in 'Sager/Clevo Reviews & Owners' Lounges' started by win32asmguy, Mar 23, 2021.

  1. Larry@LPC-Digital

    Larry@LPC-Digital Company Representative

    Reputations:
    3,952
    Messages:
    3,580
    Likes Received:
    283
    Trophy Points:
    151
    LPCDIGITAL has been a Prema partner for many years... :)
     
    Prema, Clamibot, electrosoft and 2 others like this.
  2. atquantrandash93

    atquantrandash93 Notebook Consultant

    Reputations:
    39
    Messages:
    115
    Likes Received:
    35
    Trophy Points:
    41
  3. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,431
    Messages:
    58,188
    Likes Received:
    17,895
    Trophy Points:
    931
    There are 3 M.2 slots hooked up to the chipset which work either way: 2 of them can do SATA and all 3 are PCI-E. The 4th one will work only with 11th gen and only with PCI-E, as it is hooked straight into the CPU (like AMD does).
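
    To make that layout easier to reason about, here is a minimal Python sketch (not from Clevo, just encoding the slot behaviour described above; the slot numbering is arbitrary) that checks which slots a given drive/CPU combination can use:

    # X170KM-G M.2 layout as described above (slot numbering is arbitrary).
    # Three slots hang off the chipset; two of those also accept SATA drives.
    # The fourth slot is wired to the CPU: PCI-E/NVMe only, 11th gen (Rocket Lake) only.
    SLOTS = {
        1: {"attach": "chipset", "sata": True,  "pcie": True,  "needs_11th_gen": False},
        2: {"attach": "chipset", "sata": True,  "pcie": True,  "needs_11th_gen": False},
        3: {"attach": "chipset", "sata": False, "pcie": True,  "needs_11th_gen": False},
        4: {"attach": "cpu",     "sata": False, "pcie": True,  "needs_11th_gen": True},
    }

    def slot_supports(slot: int, drive_type: str, cpu_gen: int) -> bool:
        """True if `slot` can run a 'sata' or 'nvme' drive with a 10th or 11th gen CPU."""
        s = SLOTS[slot]
        if s["needs_11th_gen"] and cpu_gen < 11:
            return False
        return s["sata"] if drive_type == "sata" else s["pcie"]

    # With a 10th gen CPU only three slots are usable; an 11th gen CPU unlocks the fourth.
    print([n for n in SLOTS if slot_supports(n, "nvme", 10)])  # [1, 2, 3]
    print([n for n in SLOTS if slot_supports(n, "nvme", 11)])  # [1, 2, 3, 4]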
     
  4. 1610ftw

    1610ftw Notebook Evangelist

    Reputations:
    266
    Messages:
    462
    Likes Received:
    517
    Trophy Points:
    106
    When AMD does that on a desktop mobo I can also connect about a dozen drives in other ways - here I am now stuck with three drives instead of 4. I happen to like having a 10 core CPU, and thanks to the infinite wisdom of the wizards at Intel this generation will have two cores fewer. But who cares, because if I find that one application that only needs 2 cores at a time and manage to switch everything else off, I might be able to go faster - can't wait for that to (not) happen.
     
  5. atquantrandash93

    atquantrandash93 Notebook Consultant

    Reputations:
    39
    Messages:
    115
    Likes Received:
    35
    Trophy Points:
    41
    Hi guys,

    How good are the 144Hz and 300Hz panels? I read that they have a 9ms and 8ms response time, respectively. How does that affect gameplay and day-to-day usage, e.g. web browsing, watching videos, editing documents, etc.?

    Sent from my LM-V450 using Tapatalk
     
  6. Entropytwo

    Entropytwo Notebook Consultant

    Reputations:
    70
    Messages:
    299
    Likes Received:
    216
    Trophy Points:
    56
    It's good if you compare it to what Lenovo (40ms) or other manufacturers throw out (18~24ms).
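
    For context, a quick back-of-the-envelope Python calculation (using the figures quoted in this thread plus a hypothetical faster panel for comparison; purely illustrative) shows how those response times stack up against the time a single frame is on screen:

    # Frame time vs. quoted pixel response time (panel figures taken from this thread).
    panels = {
        "144Hz panel, quoted 9 ms":          (144, 9.0),
        "300Hz panel, quoted 8 ms":          (300, 8.0),
        "hypothetical 3 ms panel at 144Hz":  (144, 3.0),
    }

    for name, (refresh_hz, response_ms) in panels.items():
        frame_ms = 1000.0 / refresh_hz  # how long one frame is displayed
        # If the response time is longer than the frame time, pixel transitions
        # spill into the next frame, which shows up as ghosting in fast motion.
        verdict = "slower than one frame" if response_ms > frame_ms else "within one frame"
        print(f"{name}: frame time {frame_ms:.2f} ms -> {verdict}")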
     
  7. atquantrandash93

    atquantrandash93 Notebook Consultant

    Reputations:
    39
    Messages:
    115
    Likes Received:
    35
    Trophy Points:
    41
    That's true. How do they compare to a 3ms or 1ms response time screen? I'm asking because I've seen neither in person, and if I'm paying over $3,000, I need to know as much as possible before the purchase.

    Sent from my LM-V450 using Tapatalk
     
  8. Entropytwo

    Entropytwo Notebook Consultant

    Reputations:
    70
    Messages:
    299
    Likes Received:
    216
    Trophy Points:
    56
    It depends: if you are a professional CS:GO player at world cups you might notice it, otherwise you won't.
     
    Papusan, dmanti and Donald@Paladin44 like this.
  9. atquantrandash93

    atquantrandash93 Notebook Consultant

    Reputations:
    39
    Messages:
    115
    Likes Received:
    35
    Trophy Points:
    41
    Nice! Thanks!
     
  10. Joe4zio

    Joe4zio Notebook Consultant

    Reputations:
    39
    Messages:
    220
    Likes Received:
    114
    Trophy Points:
    56
    I got the 144Hz LG Philips LP173WFG-SPB1. I would consider it a good panel, to be honest, if it weren't for the fact that I got some dead/stuck pixels I can't get unstuck, and the backlight bleed isn't the greatest. But my opinion can't count for much; this is only the second time I'm experiencing over-144Hz monitors.
     
  11. atquantrandash93

    atquantrandash93 Notebook Consultant

    Reputations:
    39
    Messages:
    115
    Likes Received:
    35
    Trophy Points:
    41
    Could you share where you bought your X170 and what the specs are? LPC-Digital said the 144Hz and 300Hz panels for the X170KM-G with RTX 30xx have 9ms and 8ms response times respectively, while yours has only 5ms.

    Here are the links from LPC-Digital:
    144Hz: https://www.panelook.com/B173HAN04.0__17.3__overview_37192.html
    300Hz: https://www.panelook.com/B173HAN05.1_AUO_17.3_LCM_overview_44168.html
    Your LG Philips LP173WFG-SPB1: https://www.panelook.com/LP173WFG-SPB1_LG Display_17.3_LCM_overview_44065.html
     
  12. Larry@LPC-Digital

    Larry@LPC-Digital Company Representative

    Reputations:
    3,952
    Messages:
    3,580
    Likes Received:
    283
    Trophy Points:
    151
    Please note:
    The supply shortage for every component, including screens, is really bad, and things can change at any time.
    We cannot totally confirm the exact screens being used during this severe situation. Sorry.
     
    dmanti likes this.
  13. atquantrandash93

    atquantrandash93 Notebook Consultant

    Reputations:
    39
    Messages:
    115
    Likes Received:
    35
    Trophy Points:
    41
    That's tough. Thanks Larry, for the update!
     
  14. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,431
    Messages:
    58,188
    Likes Received:
    17,895
    Trophy Points:
    931
    Also, the headline response figure is hard to compare panels with, due to differences in measurement methods and grey-to-grey versus full-transition performance.
     
  15. atquantrandash93

    atquantrandash93 Notebook Consultant

    Reputations:
    39
    Messages:
    115
    Likes Received:
    35
    Trophy Points:
    41
    I guess there's more to read up on than response time alone.

    Sent from my LM-V450 using Tapatalk
     
  16. Cedr1k

    Cedr1k Notebook Enthusiast

    Reputations:
    5
    Messages:
    18
    Likes Received:
    1
    Trophy Points:
    6
    Has anyone here tried to OC their i9-10900K in a Clevo laptop with the Prema custom BIOS, either through the BIOS or Intel Extreme Tuning Utility? I would like to see some results and the settings you have used in XTU or the BIOS. I will get my new X170KM-G from HIDevolution in a week and I wanted to prepare myself for overclocking this beast machine as well as I can :) Maybe someone else is using the same machine right now (liquid metal on CPU).
     
  17. S.K

    S.K Batch 80286

    Reputations:
    711
    Messages:
    1,212
    Likes Received:
    1,639
    Trophy Points:
    181
    Blow torch works great too.
     
    Clamibot, Papusan and jc_denton like this.
  18. Kp86

    Kp86 Newbie

    Reputations:
    0
    Messages:
    8
    Likes Received:
    23
    Trophy Points:
    6
    So my KM-G arrived a few days ago. Purchased from GentechPC. Shipped straight from Sager since I didn't do any modifications outside the norm. Took 5 1/2 business days from time of order to arrive at my doorstep (in the same state), super quick. Unfortunately it arrived w/ some cosmetic defects on a part of the chassis that would require a complete teardown to fix.

    Currently in communication w/ Ken @GenTechPC who's been very helpful, so hopefully Sager makes this a painless process in either getting a new replacement or refund (so I can order again).

    Outside of that, runs amazing after you dial in the settings etc

    Edit: Btw, for those not going the Prema route, these 3200MHz CL18 1.2V (JEDEC) sticks work great in it. 2x32GB (dual rank).

    https://www.amazon.com/gp/aw/d/B08LL2NC1H/ref=yo_ii_img?ie=UTF8&psc=1
     
    Last edited: Apr 6, 2021
    dmanti, Papusan and Clamibot like this.
  19. win32asmguy

    win32asmguy Moderator Moderator

    Reputations:
    1,012
    Messages:
    2,844
    Likes Received:
    1,699
    Trophy Points:
    181
    How has the fan noise been with a stock configuration out of the box? Were stock power limits also set to 130W PL1 / 250W PL2 as they are with the XMG Ultra 17 in Performance mode?
     
    raz8020 and Spartan@HIDevolution like this.
  20. DRevan

    DRevan Notebook Virtuoso

    Reputations:
    1,150
    Messages:
    2,461
    Likes Received:
    1,041
    Trophy Points:
    181
    Nice! What screen did you purchase?
    Could you please test whether LCD overdrive is active like it is on the SM, or whether Clevo finally disabled it?

    Please run this test while the monitor is running with G-Sync enabled and max refresh rate:
    https://www.testufo.com/ghosting

    Please take a picture and, if you can, a video (at least a 60 fps setting on the camera) of the laptop screen where all 3 lines are visible. Please upload the picture and video somewhere with no quality loss.

    Thank you!
     
  21. Kp86

    Kp86 Newbie

    Reputations:
    0
    Messages:
    8
    Likes Received:
    23
    Trophy Points:
    6
    Correct. Same PL limits in performance mode.

    I've been mainly using ThrottleStop & haven't touched the cancerous OC area in CCC. Just using it for fan curves & RGB. I'm mainly gaming on it, so I dropped PL1 to 130W & PL2 to 150W in TS, disabled the power limiters etc. Undervolted -125mV on core, -40mV on cache. Maintains all-core 4.8GHz w/ ease, while keeping the 5GHz+ on lower core loads. GPU OC @ +170/+1200. Temps in GPU-bound games on auto fans are around 56-60C GPU / 60C CPU. Noise has been tolerable, definitely doesn't reach max fan status.

    One thing to note: in GPU-bound games, your CPU will drop to its base all-core clock due to how Dynamic Boost 2.0 works. So w/ mine (10850K) it's 3.6GHz (Shadow of the Tomb Raider) as it's pulling close to 165W on GPU (pegged @ 99% GPU utilization, 1950-1965MHz). For CPU-bound games like Battlefield 1/V multiplayer, the CPU doesn't drop & maintains my all-core 4.8GHz throughout, w/ the GPU hovering around 150W (it was only at 65-70% GPU utilization, the game kept hitting the frame limit of 200fps @ max settings :)). GPU 55C / CPU 70-78C (CPU pulling 80-90ish watts) w/ auto fans. With max fans, the CPU wouldn't touch 70C.
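
    If anyone wants to watch that hand-off happen live, here is a minimal Python sketch using the nvidia-ml-py package (pip install nvidia-ml-py; it assumes the 3080 is GPU index 0) that logs GPU power, core clock and utilization once per second while a game runs:

    import time
    import pynvml

    pynvml.nvmlInit()
    gpu = pynvml.nvmlDeviceGetHandleByIndex(0)  # assumes the RTX 3080 is device 0

    try:
        while True:
            power_w = pynvml.nvmlDeviceGetPowerUsage(gpu) / 1000.0  # NVML reports milliwatts
            clock_mhz = pynvml.nvmlDeviceGetClockInfo(gpu, pynvml.NVML_CLOCK_GRAPHICS)
            util_pct = pynvml.nvmlDeviceGetUtilizationRates(gpu).gpu
            print(f"GPU {power_w:6.1f} W  {clock_mhz:4d} MHz  {util_pct:3d}% util")
            time.sleep(1)
    except KeyboardInterrupt:
        pynvml.nvmlShutdown()

    Watching that next to a CPU monitor like HWiNFO or ThrottleStop makes it easy to see the GPU creep toward 165W as the CPU backs off, and vice versa.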

    The 165w 3080 is definitely a beast in games, especially w/ RT.

    In my experience, OC to OC, the 165w 3080 is about 15% faster than the 200w 2080 Super in rasterization games @ 1080P, around 20% at higher resolutions. In RT games it's around 20% faster, more at higher resolutions.

    I planned to do a bunch of YT game videos, but that'll have to hold on until I get my replacement

    Here's a Port Royal benchie though, good enough for #1 laptop 3080 score :) (w/ normal game OC +170/+1200)

    https://www.3dmark.com/pr/981602

    Just the 144hz, which doesn't have that issue :)
     
    Last edited: Apr 6, 2021
  22. Clamibot

    Clamibot Notebook Deity

    Reputations:
    645
    Messages:
    1,131
    Likes Received:
    1,563
    Trophy Points:
    181
    That's the performance I expected out of the 165 watt 3080 since it's essentially a desktop 3070, which itself is about 10-15% faster than the 2080 super. Thanks for the confirmation.

    Is there any way to disable that CPU clock drop? That could have negative effects in anything that stresses both the CPU and GPU.
     
  23. Kp86

    Kp86 Newbie

    Reputations:
    0
    Messages:
    8
    Likes Received:
    23
    Trophy Points:
    6
    You can disable it in Device Manager under Software components > NVIDIA Platform Controllers and Framework. Then re-enable it when you want it back :)

    Edit: BTW, it disables Resizable BAR too. Though obviously that wouldn't matter in stuff outside of games where both are being hammered.
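
    For anyone who would rather script that toggle than click through Device Manager every time, here is a hedged Python sketch that shells out to the standard PowerShell PnP cmdlets (Windows only, needs an elevated prompt; the friendly-name wildcard is an assumption, so verify the exact device name on your own machine first):

    # Toggle the "NVIDIA Platform Controllers and Framework" software device on/off.
    # Run from an elevated prompt; check the device's exact friendly name in Device Manager first.
    import subprocess
    import sys

    def set_platform_controller(enable: bool) -> None:
        action = "Enable-PnpDevice" if enable else "Disable-PnpDevice"
        ps = (
            "Get-PnpDevice -FriendlyName '*NVIDIA Platform Controllers*' | "
            f"ForEach-Object {{ {action} -InstanceId $_.InstanceId -Confirm:$false }}"
        )
        subprocess.run(["powershell", "-NoProfile", "-Command", ps], check=True)

    if __name__ == "__main__":
        # Usage: python toggle_boost.py off   (or "on" to bring Dynamic Boost back)
        set_platform_controller(sys.argv[1].lower() == "on")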
     
    DaMafiaGamer and Clamibot like this.
  24. win32asmguy

    win32asmguy Moderator Moderator

    Reputations:
    1,012
    Messages:
    2,844
    Likes Received:
    1,699
    Trophy Points:
    181
    This should really be configurable on a per-game basis in Nvidia Control Panel. According to XMG "The +15W indication on GPU power represents NVIDIA Dynamic Boost 2.0 which only applies when the CPU power consumption is below a certain threshold." so the CPU should always win in CPU+GPU heavy scenarios.
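
    Read literally, that XMG statement describes a simple budget rule. A toy Python sketch of it (the 35W CPU threshold is a made-up illustrative value, and the 150W base TGP is just the 165W figure from this thread minus the +15W boost; neither is a confirmed number):

    BASE_TGP_W = 150        # 165 W minus the +15 W Dynamic Boost headroom discussed above
    BOOST_W = 15
    CPU_THRESHOLD_W = 35    # illustrative assumption, not a confirmed value

    def gpu_power_budget(cpu_package_w: float) -> int:
        """GPU power budget granted for a given CPU package power draw."""
        return BASE_TGP_W + (BOOST_W if cpu_package_w < CPU_THRESHOLD_W else 0)

    for cpu_w in (20, 35, 90):
        print(f"CPU at {cpu_w:3d} W -> GPU budget {gpu_power_budget(cpu_w)} W")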
     
    raz8020 and Papusan like this.
  25. Kp86

    Kp86 Newbie

    Reputations:
    0
    Messages:
    8
    Likes Received:
    23
    Trophy Points:
    6
    I can't imagine any gaming scenario I've tested where I'd actually want it disabled. If the GPU utilization is lower than 95%, it'll up the CPU speed some; if the GPU is @ 99%, the CPU will stay downclocked. When looking at the actual stats, it's one or the other. It's pretty spot on in everything I've tested. FPS is always higher w/ it on, though I haven't tested it at stock clocks, just my OC/UV settings. Obviously it would be ideal to have it at 165W regardless & just get rid of the +15 in general & make that standard, but we're stuck with this (for now). Outside of gaming, I'm not too familiar w/ a lot of the productivity apps, but I'm sure disabling it will come in handy w/ certain apps if both need to be pushed hard at the same time.
     
    Last edited: Apr 6, 2021
  26. yrekabakery

    yrekabakery Notebook Virtuoso

    Reputations:
    1,470
    Messages:
    3,438
    Likes Received:
    3,688
    Trophy Points:
    331
    The throttling/boosting algorithm is not ideal in games that stress both the CPU and GPU at the same time, like Battlefield, CoD/Warzone, Cyberpunk, Assassin's Creed, etc.
     
  27. Kp86

    Kp86 Newbie

    Reputations:
    0
    Messages:
    8
    Likes Received:
    23
    Trophy Points:
    6
    It worked great, as I noted in my long post regarding Battlefield 1/V multiplayer, if you mean in terms of getting the best FPS vs having it off :). The framerates were hitting the hard cap of 200 (GPU utilization was around 70% because of this, around 85% when not hitting the 200fps cap) at max settings. The GPU was already bottlenecked on utilization while staying well under the 150W mark, w/ the CPU at an all-core 4.8GHz as well (80-90ish watts, not downclocking); it's a CPU-bound game on the lappy 3080 @ 1080P. Boost 2.0 didn't need to kick in, since the CPU couldn't keep up even at full throttle.

    Warzone is extremely CPU bound as well; there the CPU runs full bore since the utilization on the GPU is low, so the CPU runs full speed to try & get the GPU utilization up. It does what it needs to do.

    Cyberpunk is GPU bound, at least on my OC/UV settings w/ the 10850K/3080. It pegs 97-99% GPU utilization at nearly 160-165W w/ max settings/DLSS Quality/RT Psycho, including while driving. Because of this, the CPU downclocks to give more wattage to the GPU.

    Assassin's creed is a terribly optimized game in the same fashion as Far Cry 5 on all systems. You could be at 95% utilization on GPU & use a lot less wattage than another game at the same GPU utilization :). Ubisoft really needs to fix their inefficient engines.

    The 10850K/3080 combo @ 1080P is definitely overkill for most games. You will be CPU bound (@ full speed) before needing that +15W on the GPU. Hooking up a 4K monitor? The CPU will be less of a factor as you'll finally hit GPU-bound territory, requiring those 165W, & you'll see nearly all recent games become GPU bound.

    Turning boost off in games won't increase your FPS w/ this setup @ 1080P. Just basing that off my experience. Not to mention it disables Resizable BAR. It's on when it needs to be & off when it's not. The algorithm is surprisingly smart :)

    Not defending Nvidia implementing boost 2.0 vs just straight up making it 165w, but just some helpful info thought I'd share.

    Edit: Added a couple photos, same map, kept dying trying to screenshot :(

    BFV @ 1080P maxed out, multiplayer (64 player conquest)
    BFV @ 1080P maxed out, multiplayer (64 player conquest)
    BFV @ 4K maxed out, multiplayer (64 player conquest)

     
    Last edited: Apr 6, 2021
  28. DRevan

    DRevan Notebook Virtuoso

    Reputations:
    1,150
    Messages:
    2,461
    Likes Received:
    1,041
    Trophy Points:
    181
    I am a bit paranoid after confirming that both the 240Hz and 300Hz panels have that issue. Could you please do that test to confirm that the 144Hz panel is fine? :)
     
    Kp86 likes this.
  29. 1610ftw

    1610ftw Notebook Evangelist

    Reputations:
    266
    Messages:
    462
    Likes Received:
    517
    Trophy Points:
    106
    Looks like it is about time that Clevo gets hold of those 165Hz QHD panels!

    I also noticed that it is not that hard to run the 10850k at 4.8 GHz on all cores with a bit of help from Throttlestop. Throttlestop showed a peak power consumption of up to 165W with the 10850K running on all cores in Time Spy.
     
    Kp86 likes this.
  30. Prema

    Prema Your Freedom, Your Choice

    Reputations:
    9,368
    Messages:
    6,297
    Likes Received:
    16,482
    Trophy Points:
    681
    This model has come a long way since December:

    https://www.3dmark.com/spy/16605937

    And it still has a long way to go...

    @DRevan

    Informed Clevo about the issue way back when you first experienced it, but given their track record in listening to feedback, I wouldn't count on it being fixed in the KM (there is at least no such toggle in BIOS or CCC).

    Edit: Sorry, don't have a high refresh screen to test this myself. Could someone please help this guy?!

    http://forum.notebookreview.com/thr...2m-owners-lounge.835639/page-17#post-11088323
     
    Last edited: Apr 7, 2021
  31. Cylix101

    Cylix101 Notebook Consultant

    Reputations:
    72
    Messages:
    236
    Likes Received:
    130
    Trophy Points:
    56
    So guys, what do you think is the best bang for the buck for the KM? The 10850K with the 3070? It looks like it will be hard to get a 3xxx into my P775TM, so I was thinking of selling my P775 and getting a KM. I still have time, and I want one with that 165Hz QHD :)
     
    BrightSmith likes this.
  32. BrightSmith

    BrightSmith Notebook Evangelist

    Reputations:
    143
    Messages:
    640
    Likes Received:
    383
    Trophy Points:
    76
    If you're upgrading from a GTX 1080 the 3070 makes sense. The 3080 won't give you much added value for the price imo. I wouldn't advise upgrading from a 2080, unless money is not a factor; then I would go for the 3080 of course. If multithreaded CPU power is what you're after, the 10900K will be king even in the Rocket Lake era.
     
    Cylix101 likes this.
  33. Cylix101

    Cylix101 Notebook Consultant

    Reputations:
    72
    Messages:
    236
    Likes Received:
    130
    Trophy Points:
    56
    Yeah, from the 1080. The price difference from the 3070 to the 3080 is big, too big for me anyway. Yes, I need the threads so a 10th gen is the way to go, but is the 10900K really that much better than the 10850K?

    I was thinking of buying it as a barebone with the 3070, then buying a 10850K from eBay or a good promotion, and I'll port some of the SSDs from my old P775 - I don't need to sell it with 4 SSDs inside - and even take the RAM from the old one.
     
    BrightSmith likes this.
  34. 1610ftw

    1610ftw Notebook Evangelist

    Reputations:
    266
    Messages:
    462
    Likes Received:
    517
    Trophy Points:
    106
    Sounds like a very good plan, and if you are not after the absolute maximum performance you can get out of that generation, then the 10850K is solid. When I got mine the 10900K was about 80 Euros more, and with a lowly RTX 3060 I was not going to break any benchmark records anyway :)

    Also, there seem to be certain cases where my 10850K goes down to its nominal speed without turbo, so I would NOT go with a non-K CPU, as that may mean 2.8 instead of 3.6 GHz, which is not really that great.

    Please note that you will not be able to use 4 of the M.2 form factor SSDs with a 10th gen CPU, as the fourth slot will only work with a Rocket Lake CPU. You probably know that there is no 2.5" slot any more, but I mention it just in case.
     
  35. win32asmguy

    win32asmguy Moderator Moderator

    Reputations:
    1,012
    Messages:
    2,844
    Likes Received:
    1,699
    Trophy Points:
    181
    Thanks for the insight and additional testing. I guess what I am trying to find out is whether Dynamic Boost 2.0 works differently in the X170KM than it does in all of the other BGA RTX 30xx laptops released so far. For instance, in this video you can tell that CPU TDP (not clock speed or turbo boost) is what controls GPU boost speeds. If it works the same way in the KM then I agree it should never need to be disabled, even in productivity apps.

    Although honestly, any kind of CPU limitation in exchange for the given GPU boost is not necessary and goes against the spirit of the machine. I am guessing it just has to happen this way since Nvidia does not want anybody getting hold of a signed vbios that can run at 165W TGP while bypassing the boost system.
     
    raz8020, Papusan, 1610ftw and 2 others like this.
  36. S.K

    S.K Batch 80286

    Reputations:
    711
    Messages:
    1,212
    Likes Received:
    1,639
    Trophy Points:
    181
    The multi-threading Intel king is still 10900K. 11900K is a joke!
     
  37. S.K

    S.K Batch 80286

    Reputations:
    711
    Messages:
    1,212
    Likes Received:
    1,639
    Trophy Points:
    181
    Has anyone tried the new 3080 MXM gpu in their x170sm-g yet? Does it work with the prema mod bios or is there a new firmware needed to make it work? I am wondering if a gpu and heat sink only swap is possible on the previous generation x170sm-g.
     
  38. win32asmguy

    win32asmguy Moderator Moderator

    Reputations:
    1,012
    Messages:
    2,844
    Likes Received:
    1,699
    Trophy Points:
    181
    How is single-thread performance on a 10900K? I have a bunch of work projects that are not multi-thread optimized so they can benefit in some ways if the 11th gen is faster in that regard. We do software consulting so cannot really choose the projects. It's weird because the Tiger Lake 1165G7 28W CPU can actually run those projects better than the 10700K did.

    For games it does not seem to matter much between 10900k or 11th gen, either can keep up just fine. I also do like the idea of having the 4th dedicated PCIe slot working (so much that it seems like X170SM may be a better choice for pairing with a 10th Gen CPU).
     
    raz8020, Papusan and jclausius like this.
  39. 1610ftw

    1610ftw Notebook Evangelist

    Reputations:
    266
    Messages:
    462
    Likes Received:
    517
    Trophy Points:
    106
    You would probably have to check the specific programs you are running, as it seems there is a pretty wide variance. From benchmarks it looks like single-threaded performance can be anywhere between 5 and 15% better, so it is probably the better decision in any case to take the easy route: go with the 11900K, be able to use that 4th superfast PCI-E slot if you are into that, and get a modern GPU (3070 or 3080) without too much hassle and modification.

    With everything being so hard to get at the moment, the chances of getting the SM-G at a good price with a new 30xx card and the new heatsink in a nice turnkey package seem rather slim compared to that.
     
  40. Entropytwo

    Entropytwo Notebook Consultant

    Reputations:
    70
    Messages:
    299
    Likes Received:
    216
    Trophy Points:
    56
    Something from XMG regarding the low-watt 3080:
    Apart from a successor (or refresh) of the XMG APEX 15 with at most an RTX 3070, no other model with an AMD desktop CPU is planned at the moment. There are several reasons for this, including the fact that none of the three big chip vendors (AMD, Intel, NVIDIA) is particularly enthusiastic about supporting such niche products. At Intel, support for the XMG ULTRA platform and its various predecessors (X7200, P570WM, P775TM etc.) at least has a long tradition and well-rooted networks in Taiwan, and there is currently still a relatively secure supply of CPUs and chipsets, minimising the risk of all those nice development costs being wasted by a paper launch.

    And yes, NVIDIA also plays a role here - after all, every GPU product has to go through the usual Greenlight process, and in the case of a notebook, the overall system (motherboard, CPU, power budget) also plays a role. Other ODMs (I'm not naming any names) have already tried to put systems with desktop CPUs (regardless of whether they are AMD or Intel CPUs) into development and haven't got very far despite already having finished, sensible designs for board layouts.

    On this occasion, one can also ask why an RTX 3080 in the XMG ULTRA 17 is limited to 165W TGP, when the RTX 3070 in the desktop is at 220W TGP.

    XMG ULTRA with i7-11700K and RTX 3080
    Unigine Superposition 4K Stress Test
    GPU Power (sustained): 160W
    GPU Clock (sustained): 1677MHz
    GPU Temp (sustained): 63°C

    As you can see in the GPU stress test, the graphics card with the cooling system of the ULTRA 17 still has a lot of thermal headroom: 63°C in the stress test with a maximum GPU temperature target of 87°C allowed by NVIDIA.

    The answer is as so often: the manufacturers optimise for mass, and the masses in the laptop sector want thin & light, so there is simply no board layout from NVIDIA for laptops in this generation that supports 200W. Whether this will change again in the future remains to be seen.

    Unfortunately, this is all much, much more complicated than one can (or would like to) imagine as an end customer, and unfortunately we can't talk about it very openly without breaking various NDAs.

    Cheers,
    Tom
     
    raz8020, Papusan and 1610ftw like this.
  41. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,431
    Messages:
    58,188
    Likes Received:
    17,895
    Trophy Points:
    931
    Or the 11600k/10600k if gaming is your thing.
     
  42. sniffin

    sniffin Notebook Evangelist

    Reputations:
    68
    Messages:
    429
    Likes Received:
    256
    Trophy Points:
    76
    Tiger Lake uses Willow Cove cores, a gen ahead of the Sunny Cove found in Rocket Lake. I think the new 8-core BGA Tiger Lakes will be much faster than the equivalent Intel desktop CPUs.

    I think it's a hard choice. On one hand, buying Skylake in 2021 seems utterly ridiculous - it is 6 years old :confused:. The latest mobile phone cores are creeping up on it. On the other hand, it seems something was lost in the backporting to 14nm and Rocket Lake just isn't that impressive. Given how disappointing mobile Ampere is, I think it's better to just skip this gen altogether unless you're upgrading from something older than Coffee/Comet Lake and Turing.
     
  43. 1610ftw

    1610ftw Notebook Evangelist

    Reputations:
    266
    Messages:
    462
    Likes Received:
    517
    Trophy Points:
    106
    Goes to show that Nvidia / Intel are not the only bad boys here, with AMD still refusing to support superior laptop designs. And Nvidia and Intel are currently effectively on their way to killing the remaining LGA / MXM platform, one company through sheer ineptitude and the other by intentionally dumbing down mobile graphics cards.

    I hope that a year from now I will be proven wrong, but at the moment things do not look too good, with Nvidia preventing the X170 from making use of its superior cooling capabilities, and who knows how good Alder Lake will really be when it gets released. The extremely underwhelming Rocket Lake launch for now looks more like an uncontrolled mid-air explosion...

     
    BrightSmith likes this.
  44. 1610ftw

    1610ftw Notebook Evangelist

    Reputations:
    266
    Messages:
    462
    Likes Received:
    517
    Trophy Points:
    106
    Oh no, that started out really bad :)

    Thanks for trying to find worthwhile improvements despite Nvidia effectively sabotaging more capable and better designed laptops with lower TDP's and Dynamic Boost 2.0.
     
    raz8020 and Papusan like this.
  45. MD9787

    MD9787 Notebook Enthusiast

    Reputations:
    5
    Messages:
    23
    Likes Received:
    11
    Trophy Points:
    6
    Has the benchmark been done with stock settings?

    @DRevan: I'll do the test for you. Any wishes about settings? Or anything to keep in mind?


    @ALL: has anybody located the BIOS chips already? I tried to find them. Maybe I'm just blind :)
     
  46. Entropytwo

    Entropytwo Notebook Consultant

    Reputations:
    70
    Messages:
    299
    Likes Received:
    216
    Trophy Points:
    56
    You can see DRevan's issue right in the tutorial area of Skyrim, full screen plus G-Sync on. After you are set free (when the dragon appears for the first time), look over the mountains and see if you get a purple hue.
     
  47. BillR

    BillR Notebook Guru

    Reputations:
    3
    Messages:
    62
    Likes Received:
    28
    Trophy Points:
    26
    I'm buying for a dev workstation. I need the disk throughput and the reported 20% upgrade on single-thread performance. I looked around for a notebook with an AMD 5950 & RAID disks with Clevo's capacities. I'm annoyed that I have to give up 2 cores, but I didn't see a better alternative.
     
    win32asmguy likes this.
  48. win32asmguy

    win32asmguy Moderator Moderator

    Reputations:
    1,012
    Messages:
    2,844
    Likes Received:
    1,699
    Trophy Points:
    181
    My only worry about Tiger Lake H is that it may be harder to get, more expensive, remove S3 standby support like Tiger Lake U, not have a good unlocked bios/ec option and be paired with much smaller cooling systems than the KM.
     
    Last edited: Apr 8, 2021
    raz8020, Papusan and 1610ftw like this.
  49. hacktrix2006

    hacktrix2006 Hold My Vodka, I going to kill my GPU

    Reputations:
    677
    Messages:
    2,183
    Likes Received:
    1,419
    Trophy Points:
    181
    @MD9787: "anybody located the bios chips already? Tried to find it. Maybe I'm just blind :)"

    Give me a mainboard image of both sides and I will be able to find it. They must be HQ images though, so I can really zoom in.
     
  50. DaMafiaGamer

    DaMafiaGamer Switching laptops forever!

    Reputations:
    1,286
    Messages:
    1,239
    Likes Received:
    1,635
    Trophy Points:
    181
    @Kp86 just wanted to ask you if your RTX 3080 has the same hardware ID as mine (you can check in device manager).

    Mine is as follows:

    PCI\VEN_10DE&DEV_249C&SUBSYS_16021043&REV_A1

    Does your RTX 3080 also have 249C as the identifier? If so I could potentially use that 165w vbios :D
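
    If you would rather pull that ID from a script than dig through Device Manager, here is a small Python sketch (Windows only; it just shells out to PowerShell and the standard Win32_VideoController WMI class, nothing vendor-specific):

    # Print the PNP device IDs of all video controllers so the DEV_xxxx part can be compared.
    import subprocess

    result = subprocess.run(
        ["powershell", "-NoProfile", "-Command",
         "Get-CimInstance Win32_VideoController | Select-Object -ExpandProperty PNPDeviceID"],
        capture_output=True, text=True, check=True,
    )

    for line in result.stdout.splitlines():
        if line.strip():
            # e.g. PCI\VEN_10DE&DEV_249C&SUBSYS_16021043&REV_A1 -> the device ID is 249C
            print(line.strip())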
     
    raz8020 and Papusan like this.