The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information that had been posted on the forums is preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.
I don't mind NGFF M.2 at all. There is nothing wrong with M.2. I don't like when it is used as an excuse for making a smaller piece of crap laptop, though. I'd prefer to see a chassis like the P870 that uses the space the two 2.5-inch drives used to occupy for 4 or 5 M.2 SATA SSDs (in addition to those M.2 slots that are already present) rather than reducing chassis size. I mean, who wouldn't love having like 8 M.2 slots, with 2 or 3 being 1TB NVMe SSDs and the other 4 or 5 being 2TB M.2 SATA SSDs?
It's unfortunate that too many consumers are eager to sacrifice performance and functionality for the sake of having something that is smaller and lighter. Even more unfortunate... actually tragic... when those are the only options that remain for those who are not eager to sacrifice anything and prefer a monsterbook over a chintzy lightweight piece of trash.
The concept of intentional under-engineering seems like it is here to stay. The thrust seems to be the idea of building everything to be almost adequate for ideal conditions and not caring about what happens when conditions are less than ideal, or when the owner intentionally pushes his system harder than an average consumer would. It also sucks when they use cancer firmware to cap performance to avoid having to build a system capable of comfortably operating the CPU and GPU, simultaneously, at their intended maximum stock performance sustained over a period of many hours. And, the notion of making excuses for a product on the basis that it is "only a laptop" deserves an extremely severe punch in the face, followed by multiple kicks to the groin.
The heatsink cooling capacity should be built for 200W graphics plus the PL2 limit of the 10-core chips (250W). That's a minimum of 450W, with the normal 10 to 15% thermal capacity headroom on top, aka around 495-500W cooling capacity at minimum. 340W is under-engineering!
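A quick back-of-the-envelope version of that math, using only the figures above (a minimal sketch; 10% is just the low end of the stated 10-15% headroom range):

```python
# Cooling-capacity check using the wattages from the post above.
GPU_POWER_W = 200   # sustained graphics power
CPU_PL2_W = 250     # PL2 limit for the 10-core chips
HEADROOM = 0.10     # low end of the usual 10-15% thermal headroom

combined_w = GPU_POWER_W + CPU_PL2_W          # 450 W combined load
required_w = combined_w * (1 + HEADROOM)      # ~495 W with headroom
print(f"Minimum cooling capacity: {required_w:.0f} W")  # -> 495 W
print(f"Shortfall at 340 W: {required_w - 340:.0f} W")  # -> 155 W
```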
Yes, I agree. And, if they were going to do a really fantastic job that demonstrates pride and a passion for excellence, it would be built to keep those parts abnormally cool (by notebook air cooling standards) at a sustained power draw of 650W to account for overclocking, or using the machine in a hot environment. Overclocking aside, not everyone has the luxury of staying in a ritzy climate-controlled environment at all times.
The ambient temperature spec range for computers is normally 0°C to 35°C. Clevo can't design the cooling capacity for just the lower end of that range. If they did, we would see RMA numbers go to heaven. Here at home I have 19-20°C as the target 24/7. This means I have about 15°C of headroom below the upper end of the allowed ambient range. Fine-tune the cooling and I can probably shave off another 5-10°C.
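The headroom arithmetic, spelled out (a sketch; the roughly 1:1 ambient-to-die relationship assumes a fixed load and fixed heatsink performance):

```python
# Ambient-headroom arithmetic from the post above.
SPEC_MAX_AMBIENT_C = 35   # upper end of the typical 0-35 C spec range
ROOM_TEMP_C = 20          # my room target

headroom_c = SPEC_MAX_AMBIENT_C - ROOM_TEMP_C
# At a fixed load and thermal resistance, T_die = T_ambient + P * R_th,
# so die temps track ambient roughly 1:1: a 15 C cooler room means
# roughly 15 C lower die temps before any fine-tuning.
print(f"Ambient headroom: {headroom_c} C")  # -> 15 C
```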
Unified heatsink
Just remember... no load on the graphics
I just don't believe that heatsink will keep the i9 cool. Proper air coolers in a desktop PC are gonna struggle with this chip, let alone something that borders on being called a laptop.
I don't anticipate having a surprised look on my face when that happens. Again. And, again. I'd probably have to pinch myself if the opposite were true.
Predictive modeling is important to maintain consistent outcomes. We can rest assured in the fact that it does not matter how few or how many watts it uses; the goals are the same regardless of TDP: (1) load temps will be ~100°C; and, (2) there will be thermal throttling.
The people that design laptops are predestined to meet their goals. Failure is not an option. It is their life calling and they are very good at what they do.
We should never question the outcome. It functions as intended.
I also prefer smaller laptops, PCs and other things, but not with critical sacrifices like no 2.5" drives or everything being BGA (even RAM and SSDs - this is ridiculous).
It's not just the tripod; this crap heatsink is a pain when you want to do lapping.
I did this lapping on the AW13R3 and it was a pain, with a real risk of killing the CPU/GPU die in the process. I added that 4th extra rib on the CPU side; it helps keep better contact with the CPU die.
You pay over $1000, or maybe even more, to have such bad cooling in your machine? This is unacceptable.
With good M.2 SATA drive sizes, proper cooling, and 7 to 8 slots I would be OK with that, but we are talking about a lot of things that simply nobody has ever done before. On top of that:
- NVMe SSDs are very expensive compared to standard M.2 SATA drives and 2.5" drives
- the cheaper M.2 SATA drives only go up to 2TB
- you will not get manufacturers to even go up to 6 M.2 form-factor drives; the max seems to be 4 from what I can see

Add to that the fact that cheaper 2.5" SSDs with 4 and 8TB are available while M.2 SATA goes only up to 2TB, and you can see that things either will get expensive or space will be limited compared to what we had before (quick math below). I also do not like the fact that M.2 form-factor drives have to be cooled properly, yet we often do not even get proper heatsinks for them; it is nicer to have the heatsink built in, as many manufacturers do not really care to address that issue either. I would however say that this is more of an issue for DTR use, especially when the X170 would be used for video editing.
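The quick math, using only the drive sizes mentioned above (a sketch; the two 2.5" bays are the P870-style layout referenced earlier, and prices are left out since they vary):

```python
# Max-capacity comparison using the figures from the post above.
M2_SATA_MAX_TB = 2   # cheaper M.2 SATA drives top out at 2TB
M2_SLOTS = 4         # realistic max number of M.2 slots per the post
SSD_25IN_MAX_TB = 8  # 2.5" SATA SSDs are available up to 8TB
BAYS_25IN = 2        # P870-style chassis with two 2.5" bays

print(f'All-M.2 SATA: {M2_SLOTS * M2_SATA_MAX_TB} TB max')      # -> 8 TB
print(f'Two 2.5" bays: {BAYS_25IN * SSD_25IN_MAX_TB} TB max')   # -> 16 TB
```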
LOL @ punches and kicks, that seems a bit extreme
I would like to think that we will see a leap in flagship laptop designs when the next Nvidia generation becomes available and either Intel moves below 14nm or AMD to higher max clock speeds. Until then you are absolutely correct, and the P870 really seems to be the last no-compromise design that also had the BIOS needed to make it work.
As for the X170, I look forward to seeing some real-world power draw figures under sustained load - without external cooling and with at least 20°C room temperature, I would be very surprised if we can achieve more than 400W sustained power draw without severely compromising the life expectancy of the chassis.
I would like to see some numbers for power consumption with all cores running, starting at 4 GHz and going up to at least 4.8 GHz. And air cooling only please, in a room at 20°C or more.
I doubt that even 10 x 4.8 GHz is sustainable for long in such a setup.
So for some perspective, Lummi ran a 5.4GHz R15 on his 9900K R0 on water and validated 5.8GHz. Now compare that to the -20°C needed for the 10-core at 5.4GHz, and you begin to see the exponential power consumption and heat with the two added cores on 14nm. YIKES
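For intuition, here's a rough dynamic-power sketch of why two more cores hurt so much (all voltage and scaling numbers are illustrative assumptions, not Lummi's actual settings):

```python
# Rough dynamic-power model: P ~ n_cores * C * V^2 * f.
# Every figure here is an illustrative assumption, not a measurement.
def relative_power(n_cores: int, vcore: float, freq_ghz: float) -> float:
    """Relative dynamic power for n cores at a given Vcore and clock."""
    return n_cores * vcore**2 * freq_ghz

p_8c = relative_power(8, 1.30, 5.4)    # hypothetical 8-core 9900K operating point
p_10c = relative_power(10, 1.40, 5.4)  # 10 cores tend to need more Vcore for the same clock
print(f"10-core vs 8-core at 5.4 GHz: {p_10c / p_8c:.2f}x the power")
# -> ~1.45x: the extra cores and the higher voltage compound, which is why
# the 10-core part reportedly needs sub-zero cooling to hold 5.4 GHz.
```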
'Cause AMD is midrange tech compared to Intel, apparently, so they don't deserve the good Nvidia GPUs.
If I want to pair a Ryzen 3 with an RTX 2080, then let me, damn it! (I'm honestly wondering if those new R3 3000 chips would be fine with an RTX 2080...)
No, I meant that the closest AMD variant doesn't even have a 2080; you're stuck with a measly RTX 2070. Nothing higher than that is available. I don't know who came up with that laptop, lol. A few more heatpipes and you could cool a 150W TDP RTX 2080. Instead they went with a 2070.
Oh well, with a shunt mod you could get to the 2080's performance level, so there is that.
BTW the RTX 2080 Super will be MXM; at least Clevo is doing something right!
1) we have a choice of LGA CPUs
2) we have an RTX 2080 Super, fitted on an "MXM" module (is it still a proper 2080?)
3) we have a 4K screen (but what is the panel?)
4) we have advanced cooling
So in theory this laptop is the king of all laptops at this moment (for gamers)? It is much better than the A51m on paper.
The X170 is held to a higher standard than others as it basically is the successor of the mighty P870.
I would say yes to the first two points; the other two remain to be seen.
Somehow my AW 51m survived a two-day FurMark test, I wonder how.
@Papusan
Maybe luck is on my side.
But yeah, this Clevo will slap the Alienware 51m around. Clevo's VRM design and choice of MOSFETs make it 10 times superior to Dell's. Wanna know the worst brand for reliability? It's not Apple, lol, it's Razer, and still people blindly buy that garbage. That is straight BGA trash, worse than Apple's MacBooks for sure!
Nah, the 3990X on turbo uses around 350W IIRC, overclocked it can easily go to 800W, and on liquid nitrogen it goes over 1kW. That makes this look like child's play.
That's still 3W over that i9 at 5.1GHz. The power requirements of that i9 are nutty and ridiculous. I just hope Clevo wises up and makes a proper Ryzen version of this X170.
Actually it is both, as it is now the best Clevo that one can buy, which before was the P870. It is also true insofar as, in this thread, it is repeatedly (unfavorably) compared to the P870 and not to the P775.
The concept of superior cooling capability would not only apply to SLI graphics these days but also to current power-hungry CPU and GPU chips - 500W is 500W, no matter whether the power is consumed by an SLI model or by a model with only one 200W+ GPU and a 10+ core processor that may need more than 300W in certain situations.
You mean for the people that do not enjoy overclocking, right? Before AMD made good CPUs their garbage CPUs were fantastic at overclocking. They just didn't offer any respectable performance regardless of clock speed.
Their GPUs have sucked at overclocking for about a decade. If you enjoy overclocking, then AMD is not the right brand for the job. They can barely achieve clock speeds at sub-zero temps on LN2 that Intel CPUs can do on water.
I'm not saying Ryzen is no good, only that it's an insanely boring product if you enjoy overclocking. If you just want to run it stock and play games, it'll be fine. If you want to build a server and save money, they're fine for that, too. Makes a lot more sense for that financially.
Agreed, Intel is still the king of maximum clock speeds for gaming, where maximum speed on 8+ cores is not needed, and they are not that hard to overclock. This is also why I see a legitimate use for Intel chips in high-end gaming laptops.
Even if Intel did not monopolize high-end Nvidia cards, I would not expect many manufacturers to be willing to offer both Intel and AMD CPUs in their high-end laptops, simply because sales in that segment will not be high enough to justify both a high-end Intel and a high-end AMD solution.
I wonder what the AMD fanboys are drinking right now... The AMD Kool-Aid? They all bash Intel, but forget that AMD ain't any better. At least you'll get Rocket Lake on top of Comet Lake with the Clevo X170 and Z490.
4th generation Ryzen desktop processors will only support AMD 500-series (or later) chipsets. The next-gen processors will not work with older 400-series chipsets.
Considering just how different the CPU architectures have been, it's amazing the compatibility has been what it has. It's like a Core 2 and the first-gen i7 being able to go into the same motherboard.