The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, to ensure that the valuable technical information posted on the forums was preserved. For current discussions, many NBR forum users moved to NotebookTalk.net after the shutdown.

    *** Official Clevo P640RE/Sager NP8640 Owner's Lounge ***

    Discussion in 'Sager/Clevo Reviews & Owners' Lounges' started by NordicRaven, Sep 15, 2015.

  1. Prema

    Prema Your Freedom, Your Choice

    Reputations:
    9,368
    Messages:
    6,297
    Likes Received:
    16,482
    Trophy Points:
    681
    Only the new P7 and P8 have TB3...
     
  2. Megacharge

    Megacharge Custom User Title

    Reputations:
    2,230
    Messages:
    2,418
    Likes Received:
    14
    Trophy Points:
    56
    I'm in the market for thin/light, so those do nothing for me. I don't get how they haven't included this on all models, especially the smaller more portable ones. This seems backwards to me.
     
  3. darkarn

    darkarn Notebook Evangelist

    Reputations:
    47
    Messages:
    655
    Likes Received:
    226
    Trophy Points:
    56
Thinking about this on a larger scale, TB3 is actually kind of niche. Not surprising that only the big boys get all sorts of stuff, including TB3. Being small means that Clevo has to prioritise which features are the most used, and therefore should be included, and TB3 isn't one of them. But of course, being big has its own issues...
     
    Ionising_Radiation and jaybee83 like this.
  4. Megacharge

    Megacharge Custom User Title

    Reputations:
    2,230
    Messages:
    2,418
    Likes Received:
    14
    Trophy Points:
    56
It's going to be mainstream by January. It also makes more sense that if they had to choose which models to put it in, it would have gone into their thin/light models first, rather than big bulky desktop replacements, for obvious reasons. Regardless, TB3 will be extremely useful for me, and I'd imagine for others as well, and as a Clevo fan, this angers me and forces me to go another route, which I didn't want to do. :mad:
     
    Last edited: Oct 3, 2015
  5. darkarn

    darkarn Notebook Evangelist

    Reputations:
    47
    Messages:
    655
    Likes Received:
    226
    Trophy Points:
    56
Hmm. I am not sure how/why it will be mainstream. Can you tell me more?
     
  6. jaybee83

    jaybee83 Biotech-Doc

    Reputations:
    4,125
    Messages:
    11,571
    Likes Received:
    9,149
    Trophy Points:
    931
chillax and take a deep breath @Megacharge :D tb3 is so super brand new, how the heck is it supposed to be mainstream already across all laptop models? ;) as with usb 3.1, only highend models will adopt these new tech standards, and only AFTER that will they slowly diffuse into the mainstream market. january still seems a tad optimistic, to say the least... heck, intel hasnt even officially certified the tb3/usb 3.1 combo port yet! :D
    its like saying: oh hey, they JUST released the first and ONLY 17 inch 4k screen with 100% gamut, i NOW expect ALL clevo models to adopt 100% gamut with 4k and gsync by next week or ill be very angry! /rant

    get my point? ;)

    Sent from my Nexus 5 using Tapatalk
     
  7. Megacharge

    Megacharge Custom User Title

    Reputations:
    2,230
    Messages:
    2,418
    Likes Received:
    14
    Trophy Points:
    56
    A 14" laptop that's a bit less than 5 pounds and is 1" thick which sports quad core skylake and a 970M isn't high end to you? It's pretty much the best laptop in that size/weight range, "if" it had TB3.

    And you guys don't see the importance/more usefulness of it being in a more portable laptop before a desktop replacement?
     
  8. jaybee83

    jaybee83 Biotech-Doc

    Reputations:
    4,125
    Messages:
    11,571
    Likes Received:
    9,149
    Trophy Points:
    931
    sure its "higher end" but there are still more than enough more powerful models above it, that was my point :)

well it depends how u look at it. u might as well argue that a "thin/light" machine just doesnt have the necessary cpu power to drive an eGPU. besides, since ull already be using an eGPU and thus be forced to stay stationary, why would u be bothered by a big n heavy machine? its all in the eye of the beholder i guess :p
     
  9. Megacharge

    Megacharge Custom User Title

    Reputations:
    2,230
    Messages:
    2,418
    Likes Received:
    14
    Trophy Points:
    56
Why would I argue that? That would be silly considering the first page of this thread. Obviously those processors won't have an issue driving even the most powerful of desktop cards. On top of that, the whole point is the difference in purpose between a thin/light and a DTR. I could argue: why even need Thunderbolt if you're going to have a massive DTR that rarely leaves the desk and already has desktop power? Whereas the 14" doesn't, and would appreciate TB3 far more than the DTR would. Obviously both could use it, but logically, in the short and long term, the smaller laptop will get much more benefit out of it, and that's what makes it a smarter, more logical choice (if a choice must be made) to implement it on the smaller, less powerful laptop before a DTR.
     
  10. DataShell

    DataShell Notebook Deity

    Reputations:
    47
    Messages:
    777
    Likes Received:
    354
    Trophy Points:
    76
    There are other uses for TB3 besides PCIe connections for eGPUs...
     
  11. jaybee83

    jaybee83 Biotech-Doc

    Reputations:
    4,125
    Messages:
    11,571
    Likes Received:
    9,149
    Trophy Points:
    931
    i meant "you" as in generally speaking. u were asking why they wouldnt implement tb3 on thin/light laptops, so that would be a point to make for that :)

    also, why would it be "obvious" that the mobile BGA crap cpus would be sufficient to drive desktop gpus? its already been shown that the 980M is being held back by regular HQ cpus, so i highly doubt that a fullblown desktop gpu would be fully unleashed.

    aside from that: there arent even any official TB3 eGPU casings around, so ud still have to mod yourself through to a solution :p for now, tb3 is just a gimmick and nothing more, until actual egpu solutions are presented by manufacturers :) so this argument is pretty moot, anyways :D
     
  12. Megacharge

    Megacharge Custom User Title

    Reputations:
    2,230
    Messages:
    2,418
    Likes Received:
    14
    Trophy Points:
    56
    That goes without saying.
     
  13. Support.1@XOTIC PC

    Support.1@XOTIC PC Company Representative

    Reputations:
    203
    Messages:
    4,355
    Likes Received:
    1,099
    Trophy Points:
    231
Absolutely there are. I think people get most excited about eGPUs just because of the possibilities they might be able to offer at some point. I would agree that it would be nice to have it on some of the slim or less expensive models, and not just the desktop replacements. It would be somewhat of a niche product, and it is all just speculation as to how practical it would actually be. But it's a new possibility, and who knows where it could end up in a couple of years, so I think that is why people get excited about it. I would personally love to have a universal eGPU that would work with about any model, and not have a proprietary connection type on it that is only usable by certain manufacturers and models.
     
    DataShell likes this.
  14. darkarn

    darkarn Notebook Evangelist

    Reputations:
    47
    Messages:
    655
    Likes Received:
    226
    Trophy Points:
    56
Hmm... Still not sure what they are. What else can they be?

Good point, now eGPUs don't seem so attractive. I wanted one as a contender to the Gaming Dock or the Graphics Amplifier; have "good enough" graphics on the go while having desktop graphics at the desk, but now with the CPU bottlenecking... Back to the drawing board?
     
  15. Prema

    Prema Your Freedom, Your Choice

    Reputations:
    9,368
    Messages:
    6,297
    Likes Received:
    16,482
    Trophy Points:
    681
    No question TB3 on this system (let's call it "WAVE" ;) ) would have been nice.
I am sure it was scrapped for financial reasons on the P6 models, as the projected number of extra sales from implementing it couldn't justify the licensing fees to Intel and the other involved parties.

    Something to cheer you up >>> There will actually also be an option for i7-6820HK in the P640RE. :)
     
    Last edited: Oct 3, 2015
  16. jaybee83

    jaybee83 Biotech-Doc

    Reputations:
    4,125
    Messages:
    11,571
    Likes Received:
    9,149
    Trophy Points:
    931
    P6 with 6820HK? now thats a nice combo :)
     
    TomJGX likes this.
  17. Support.1@XOTIC PC

    Support.1@XOTIC PC Company Representative

    Reputations:
    203
    Messages:
    4,355
    Likes Received:
    1,099
    Trophy Points:
    231
    TB hard drives for faster data transfer speeds. You should be able to daisy chain some monitors off of one port. Other peripherals with TB connections. There isn't much that uses TB, but there are some people that do specifically look out for models that have it available.
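As a rough illustration of why one TB3 port can feed fast storage and daisy-chained monitors at once (an editorial back-of-envelope sketch, not from the original post; the display math ignores blanking intervals and protocol overhead, so real headroom is smaller than shown):

```python
# TB3's headline link rate is 40 Gb/s; actual usable throughput is lower
# once DisplayPort allocation and protocol overhead are accounted for.
TB3_LINK_GBPS = 40.0

def display_gbps(width, height, hz, bits_per_pixel=24):
    """Raw pixel data rate for an uncompressed stream. Ignores blanking
    and encoding overhead, so it understates the true link cost."""
    return width * height * hz * bits_per_pixel / 1e9

uhd60 = display_gbps(3840, 2160, 60)   # one 4K60 stream
print(f"4K60 raw pixel rate: {uhd60:.1f} Gb/s")
print(f"Two 4K60 streams:    {2 * uhd60:.1f} Gb/s of {TB3_LINK_GBPS} Gb/s")
```

Even two 4K60 displays leave a sizeable slice of the link for storage or other peripherals, which is the daisy-chaining appeal.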
     
    darkarn likes this.
  18. darkarn

    darkarn Notebook Evangelist

    Reputations:
    47
    Messages:
    655
    Likes Received:
    226
    Trophy Points:
    56
    Ah I see... I heard about some video editing peripherals that needed TB too.
     
    Prema likes this.
  19. Bullrun

    Bullrun Notebook Deity

    Reputations:
    545
    Messages:
    1,171
    Likes Received:
    494
    Trophy Points:
    101
Hopefully, TB3 will take off, but I don't get the "make or break" opinion. What peripherals are available now? How many options did P37xSM owners have with their TB port? What are the prices like? USB Type-C, USB 3.1? These will take off because they're the next iteration of a popular, well-supported port. But what's available today? The Type-C connector may be the best hope for TB3.
IF this happens, then get upset if Clevo leaves it off a model. I understand them including it on their high-end models only. It's still a gamble, but they won't get caught with their pants down.
     
    jaybee83 likes this.
  20. Ionising_Radiation

Ionising_Radiation Δv = ve*ln(m0/m1)

    Reputations:
    757
    Messages:
    3,242
    Likes Received:
    2,667
    Trophy Points:
    231
    @Megacharge, mate, you've got to calm down just a bit.

    Not everyone wants a giant GPU plugged into a small laptop - that's the whole point of getting a small laptop, isn't it? And as @jaybee83 rightly pointed out - TB3+USB3.1 is very new. It's going to take a while before such ports filter down to thinner notebooks like these.
     
    s19 and TomJGX like this.
  21. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,431
    Messages:
    58,189
    Likes Received:
    17,900
    Trophy Points:
    931
TB3 needs an extra chip to be added, and this requires extra board space and cost that is not always available.
     
  22. Stooj

    Stooj Notebook Deity

    Reputations:
    187
    Messages:
    841
    Likes Received:
    664
    Trophy Points:
    106
Currently the GS60 looks like the best bet, short of knowing what Gigabyte, Aorus and Razer are up to with their machines.

The issue I'm grappling with is the ever-closing gap between desktop and mobile CPUs, coupled with the fact that in gaming (arguably the primary purpose of these laptops) that difference means even less. That being said, it does have to function for work purposes (half-decent battery life, relatively light, etc.). The possibility of using a laptop + eGPU as a gaming rig at home is VERY appealing, especially for those who've been waiting to upgrade to Skylake from older CPUs. My desktop is a Sandy Bridge i7-2600K clocked at 4.4GHz, and the current mobile CPUs pretty much match it anyway.

    I'm looking to replace my current laptop with something for probably 2 years. A thin laptop with TB3/eGPU holds enormous value because it buys significant longevity. If CPUs only increment at the current super-slow rate, then a Skylake laptop could very well replace my desktop at the same time. Without the possibility of eGPU, not so much.

    EDIT: I've been a Clevo/Metabox fan for my last 4 laptops now. But with an aging desktop this could very well be what sways me.
     
    ghegde likes this.
  23. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    In what way?

    There is a GROWING gap, not a shrinking one. We had nonexistent gaps with Sandy Bridge, Ivy Bridge and Haswell socketed mobile CPUs. We now have huge gaps that are likely only going to get worse and worse; Skylake's line is the slowest line I've seen. Haswell's weakest non-lower-power i7 HQ chip was (assuming 4-core load) 3.2GHz with turbo. Broadwell's was 3.5GHz with turbo. Ivy Bridge's 3610QM was 3.1GHz like the 6700HQ, but the 3630QM was much more common and clocked higher.

    Also, the 6820HQ is only 3.2GHz and the 6920HQ is only 3.4GHz. We don't know if the 6820HK will hold its TDP under long load times so its overclockability might be a moot point; and it's 3.2GHz base only.

    That's going backwards.
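For a quick side-by-side, the 4-core turbo clocks quoted in this post can be tabulated (an illustrative sketch using only the numbers cited above, not a full spec sheet; check Intel ARK for exact per-SKU figures):

```python
# 4-core turbo clocks as quoted in the post above.
quad_core_turbo_ghz = {
    "i7-3610QM (Ivy Bridge)":        3.1,
    "Haswell i7 HQ (weakest non-LP)": 3.2,
    "Broadwell i7 HQ (weakest non-LP)": 3.5,
    "i7-6700HQ (Skylake)":           3.1,
    "i7-6820HQ (Skylake)":           3.2,
    "i7-6920HQ (Skylake)":           3.4,
}

baseline = quad_core_turbo_ghz["i7-6700HQ (Skylake)"]
for chip, ghz in quad_core_turbo_ghz.items():
    delta = (ghz - baseline) / baseline * 100
    print(f"{chip:34s} {ghz:.1f} GHz ({delta:+.1f}% vs 6700HQ)")
```

By these figures, Skylake's entry HQ chip sits at Ivy Bridge's level and almost 13% below Broadwell's, which is the "going backwards" point.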
     
  24. Stooj

    Stooj Notebook Deity

    Reputations:
    187
    Messages:
    841
    Likes Received:
    664
    Trophy Points:
    106
    For the GS60, it's one of the only "thin" laptops currently available which has had a USB3.1/TB3 combo port added to it.

    As for the CPU gap:
Depends what you're comparing and testing with (ignoring super-high-TDP Socket R and such). Most people are comparing the 6700HQ with the 6700K (which was all that was available for a time), which is an unfair comparison. The rated TDPs are entirely different and there's no way around that. However, the standard i7-6700 is very similar in performance to the i7-6700HQ (assuming the mobile one is in a chassis which can hold its turbo accordingly).

    You also conveniently ignored the part where under gaming load the gap, even where there is one, is almost useless.

My current 2600K @ 4.4GHz can match a stock-standard i7-6700HQ in some tests and gaming loads. It's currently paired with a 980 Ti and doesn't hamper it in any way (I've yet to find a game that really holds it back). But it does so at a massively higher power budget. The 6700HQ either beats my old CPU or falls short by at most 5% or so, with a third of the power.
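Taking this post's own figures at face value (within ~5% of the overclocked 2600K at "a third of the power"), the implied efficiency gap works out as follows (an editorial sketch, not measured data):

```python
# Thread's figures: the i7-6700HQ falls short of a 4.4 GHz 2600K by at
# most ~5%, while drawing roughly a third of the power.
mobile_rel_perf = 0.95    # worst case: 5% behind the desktop chip
mobile_rel_power = 1 / 3  # "a third of the power"

ratio = mobile_rel_perf / mobile_rel_power
print(f"Implied perf-per-watt advantage of the 6700HQ: ~{ratio:.2f}x")
```

So even in the worst case the claim amounts to nearly a 3x perf-per-watt advantage for the mobile part, which is the heart of the efficiency argument.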
     
  25. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    P640RE doesn't? I thought it does.

    Incorrect. i7-6700HQ is 3.1GHz 4-core turbo. i7-6700 is 3.7GHz 4-core turbo. That's a huge performance difference. The 6700K is also 4.2GHz turbo.

Let's check Haswell's gen. i7-4710MQ was 3.3GHz. i7-4790 was 3.8GHz, so a shrink. Also 4810MQ was 3.6GHz, and was OCable to 4GHz, which made it better than a 4790 theoretically. 6820HQ is 3.2GHz alone, and its TDP lock means that even if you could shove it to 3.6GHz, it isn't likely to hold that clock nearly as well as the non-TDP-locked MQ chips could (in good boards). Plus, comparing the K chips to the K chips... the 4770K was 3.7GHz with turbo by default and the 4930MX was ALSO 3.7GHz by default... both fully unlocked.

    So no. We're getting worse. A lot worse, in fact.

    I didn't ignore any gaps... I play at 120Hz and my 3.5GHz i7 can limit me. Like this. Just because YOU don't see bottlenecks too easily doesn't mean I am in the same boat. Also, a 4.4GHz 2600K is better than a 3.1GHz 6700HQ. By quite a large margin, in fact. You would need to clock your chip to a paltry 3.9GHz or so to match the 6700HQ's performance on average. And the 6700HQ, assuming TDP limitation like other HQ chips, will likely be easily limited in heavy rendering tasks or things that go beyond the low-power environment of gaming.

So... no. I don't think we're closing any gap. I think the gaps are getting wider. Much wider. And the BGA-only nature of mobile chips is making it worse, coupled with the TDP lock that's most likely built into the chips themselves. You'll absolutely under no circumstances ever convince me otherwise... it's literally impossible to factually persuade me here.
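A quick bit of arithmetic on the claim above that a Sandy Bridge chip needs roughly 3.9GHz to match a 3.1GHz 6700HQ (the thread's own estimate, not a benchmark):

```python
# Figures from the post: a Sandy Bridge i7 at ~3.9 GHz matches a
# 3.1 GHz Skylake i7-6700HQ on average.
sandy_match_ghz = 3.9   # Sandy Bridge clock said to equal the 6700HQ
skylake_ghz = 3.1       # i7-6700HQ 4-core turbo

ipc_uplift = sandy_match_ghz / skylake_ghz  # implied per-clock advantage
equiv_ghz = 4.4 / ipc_uplift                # 4.4 GHz 2600K, normalised

print(f"Implied Skylake IPC uplift over Sandy Bridge: {(ipc_uplift-1)*100:.0f}%")
print(f"4.4 GHz 2600K ≈ {equiv_ghz:.2f} GHz Skylake-equivalent")
```

By that estimate a 4.4GHz 2600K behaves like a ~3.5GHz Skylake quad, comfortably above the 6700HQ's 3.1GHz, which is consistent with the "quite a large margin" claim.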
     
  26. Stooj

    Stooj Notebook Deity

    Reputations:
    187
    Messages:
    841
    Likes Received:
    664
    Trophy Points:
    106
    Nope, doesn't. Prema confirmed early in the thread.

    On pure clockspeed, yes there is a difference if you talk about CPU benchmarks. But as I said, such bottlenecks are quite rare.

Dying Light is somewhat of a bad (good?) example because it has serious CPU usage issues. Even a 5960X won't push the average frame-rate much above 120fps. It's a zombie game ffs, with literally the dumbest AI routines that can be programmed, which makes you wonder where the hell the CPU cycles are actually going....

I guess, for me and probably many other gamers, CPUs have simply hit an "acceptable" point. Very few programs actually cap out due to a CPU bottleneck. Beyond that, any task that really needs the extra CPU power will be a calculation-heavy job that takes hours and is probably better offloaded to a desktop or server in some manner. For all others, I guess there's the P7 series.

Fact is, you'll never get what you want. Desktop CPUs target ~100W; mobile chips target <50W. Those two will likely never converge, because we've pretty much determined the practical limits of cooling CPUs in laptops. If you keep using desktop processors as the benchmark for your laptop, you'll likely be forever disappointed.
     
  27. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    Oh. That's sad.

    Rare to you, as I said. Single-threaded games often give me issues holding 120 and multi-threaded games do too. 60Hz is a different story... but I shouldn't have to downgrade my expectations because I'm on a laptop.

    Want an example from GTA V?

Yes. For you and many other *GAMERS* it has hit an "acceptable" point. However, as a livestreaming master *points at title* and as someone who's done rendering quite often on this machine, I can tell you that a HQ chip would have had me returning the machine about 300 times until I got a golden CPU that I could undervolt enough to almost never cross the TDP lock. And lowering clockspeeds to hit a lower TDP limit is not the way to "progress" for the mobile market, but I suppose that way CPUs won't be accused of "TDP throttling" as often.

    I won't? Are you sure about that? I'm complaining because we *WERE* at the point of greatness, and have backpedaled. As I said: it's not possible to factually change my mind.

Let me put it this way: the stock response to "well, I want more power" or "I want to overclock" being "just get a desktop" is the reason why we're here right now. The people who could vote with their wallets don't care that the new laptops are inferior. At all. They're HAPPY for the inferiority, and the soldered components. Why? Because now they have something super super thin they can play around with. And so the "market" said "it's fine to go backwards" because we're at "an acceptable level".
     
  28. Stooj

    Stooj Notebook Deity

    Reputations:
    187
    Messages:
    841
    Likes Received:
    664
    Trophy Points:
    106
    And here we have the crux of the situation.

    This mentality is simply unrealistic. They simply cannot jam 100W of processing power into a thin laptop (we are in the P640RE thread here...). Expecting to do so is just unrealistic. It's simple physics.

    That's an Alienware 18. In that size field you can get EXACTLY what you want in a P870DM now (desktop processor and desktop 980 or 980M-SLI). We're talking about a thin 14" machine here where for MANY cases the 6700HQ is more than adequate.

Again....P640RE thread. 14", thin, portable gaming machine. Not a desktop killer. The P640RE (and any other 6700HQ+970/980M machine) is not supposed to supplant a super-powerful desktop. That being said, I reckon it'd give the typical mid-range i5+960/970 desktop which many would own a serious run for its money.

    "Thin" is very much something that many here will find useful and has value in and of itself. Personally my laptop has to be portable enough for work purposes as well. I've already got to carry a myriad of tools, cables and other gear and such. I imagine many people also take these to uni or meetings where they don't want to carry a tonne of weight. That's the trade-off that's made. However, the loss taken by going thin is getting smaller and smaller.
     
    s19 likes this.
  29. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
The CPU cooling on a P6xxSx and P6xxRx is actually at worst equal to (and possibly stronger than) that of my P370SM3, which was used for the Linpack screenshot I showed you. You are capable of heavy loads with a properly polished heatsink and your fan on max with good elevation (for airflow). Even in those "thin" machines, you can get decent performance. Besides, aren't those "overclockable" 6820HK chips being sold with the P6xxRx models? =D

    Well yes. An AW18 was indeed what I showed with the overclock. But what about the 17" models that are single GPU? Or the 15" ones, that some people have used to run 4GHz+ CPUs? The huge cooling that's in the desktop systems is because the desktop CPUs generally push out MORE heat than the mobile ones. 4GHz mobile isn't difficult to have even with this 14". In fact, considering how cool maxwell GPUs are, I'd say making a decent triple-pipe heatsink and putting the GPU in the slot the CPU is in and having the CPU use the dual-fan heatsink area would be a far better way of dealing with thermals; almost everyone in these threads came to that conclusion when the P6xxSx models released.

    I didn't claim that I considered the P640RE to be a desktop killer. In fact, I haven't mentioned much about the P640RE with respect to power at all. What I claimed, and what is still true, is that mobile CPUs are WORSE OFF than they were 2-4 years ago. And that is in fact true, as I have proven. The fact that we're FORCED to use mainstream desktop CPUs for real power means that your choice of notebook as a power user or a business is EXTREMELY limited. The people in THIS THREAD are great examples of why that is a problem.

I don't claim thin machines to be a bad thing, and I believe quite a few people like them and even have genuine need for them. But shoving way too much power in them to be cooled (like Gigabyte laptops and Razer Blades and MSI's GS line that EASILY overheat the majority of their CPUs), and having the media and public say "well it's a laptop, obviously it'll throttle... but look how much they got to fit in there!", is the opposite of helpful.

    Need something sub 1" thick and 4 pounds or less? Get a ULV CPU and a midrange, cool card like a 960M. Need power? Deal with the blasted 1.5 pounds extra and 0.24" extra and get a properly working machine.

The laws of physics do not bend backwards because somebody wants a thin laptop with power in it. The P6xxSx and P6xxRx laptops are fantastically designed for the most part, and that includes this P640RE. They're much lighter than anything Clevo has done before with this level of power, and they keep cool and don't throttle you unless it's out of their hands (like Intel's built-in power throttle). Know what doesn't get that praise? Alienware's entire current lineup, which throttles people's CPUs to their non-turbo ratios for no reason. Lenovo's machines, which lock power and kill CPU turbo if CPU and GPU are stressed, because they didn't design their cooling system well enough. Anything Razer has created since 2014. Anything with the name "Aorus" on it. Etc etc.

    I'll say it again. NOBODY should lower their standards "because it's a laptop". That's the wrong way to go, and if you disagree, you're flat out part of the problem. Demanding quality for our money is not "bad". It is not being "picky". You want me to pay 2x as much for my CPUs and 3x as much for my GPUs? You had better flipping present me a machine crafted by the forge at Mt. Olympus with materials provided by the mines of Asgard and powered by Thor's lightning bolts itself. And this whole "crafting them to limit performance artificially" needs to never be a thing again.
     
  30. Stooj

    Stooj Notebook Deity

    Reputations:
    187
    Messages:
    841
    Likes Received:
    664
    Trophy Points:
    106
    Stopping this here. I'm not going to get into a multi-page long battle over a model which we haven't even seen yet, over topics that have diverged from the P640RE and with someone who clearly has a different and unwavering opinion. My original comment was made with gaming in mind.

I'm genuinely interested to see this model, but the hit in battery capacity, the (possible) lack of a G-Sync option as in the P650 and the lack of USB3.1/TB3 make it a hard sell for me: a gamer who also needs some semblance of mobility, given the existence of the GS60.

    EDIT: I'm all for a good discussion/argument. But better keep things on topic. Far too many model threads get bogged down with this kinda stuff. Better to PM or start a discussion thread somewhere.
     
    Last edited: Oct 5, 2015
  31. Ramzay

    Ramzay Notebook Connoisseur

    Reputations:
    476
    Messages:
    3,185
    Likes Received:
    1,065
    Trophy Points:
    231
I don't think it's about lowering standards - it's about being realistic about what to expect from a thin & light machine.

I know that there's only so much you can fit into a small machine before thermals and noise become an issue. So yes, in a sense I do "lower my standards", because I don't expect more performance from a machine the size of the P640RE than what could reasonably be expected. Even if it does have a GTX 970M + i7 CPU, I fully expect there to be some amount of thermal throttling to keep noise and heat levels in check. I don't expect an i7-6820HK + 970M in a 14" thin machine to provide the same level of performance as the same components would in a big 17" chassis.

    Is that lowering my standards? No, it's being realistic. I don't expect a Toyota Prius to win the Grand Prix. I don't expect a fat-free/sugar-free cheesecake to be quite as delectable as the full-fat version. I know what I'm buying when I get a small/thin machine.

    Now, there's a valid argument to be made as to why they put CPUs/GPUs in a chassis that can't handle the heat and noise. Probably marketing.

And besides, you DO have to "lower your standards" when it comes to laptops - from a certain point of view. Size constraints mean something's gotta give, period. It'll either be heat, noise, throttling, power, something. You can't fit the same components you would find in a full-size ATX tower with full water-cooling into a 15" laptop chassis and expect performance to be 100% identical. And since heat & noise are a major concern to the vast majority of laptop users (can't use it on your lap if it's roasting hot, and can't use it at work if it sounds like a jet engine), concessions are made in regards to TDP/power etc.

All that being said, I agree with your sentiment that I feel is lurking below the surface - people are asking to have their cake and eat it too (aka a super-powerful thin 14" laptop that's just as beastly in performance as an 18" monster DTR). Not gonna happen (not yet, anyway), but companies keep making and marketing those laptops, people buy them, then complain when they throttle/get hot enough to cook their breakfast/loud enough to wake the dead. You want a thin machine? You'll have to make concessions in terms of CPU/GPU power. You want balls-to-the-wall power? You'll have to make concessions in terms of size/weight (maybe even heat/noise).
     
    Last edited: Oct 6, 2015
    s19 likes this.
  32. FLAT EARTH

    FLAT EARTH Notebook Geek

    Reputations:
    5
    Messages:
    88
    Likes Received:
    35
    Trophy Points:
    26
I just wanted to say I can totally understand where Megacharge is coming from, but since I already have a relatively thin laptop, I decided I wanted a bump in power and potential for upgrading, so I got into the Clevo P750ZM and may eventually buy its successor, the DM. I also wanted the Skylake P6's with Thunderbolt 3, but it's not happening yet.

However, if you're looking for a reasonably priced and sized laptop with TB3, I recommend this one; it's the lowest-priced TB3 laptop I could find in a laptop weight class:
    http://us.msi.com/product/nb/GE62-Apache-Pro-6th-Gen-GTX-960M/#hero-overview
     
  33. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    Lowering standards and being realistic with expectations are two different things, and ALL my posts were being pretty specific.

For almost all of my book-like posts on the last page I wasn't talking about the P640RE; rather, I was claiming (correctly) that the gap between laptop CPUs and desktop CPUs has widened considerably, rather than "gotten closer", especially in the options available for power users and overclockers. I also do not expect a 14" thin machine to provide the same cooling as a 17" thin machine can... I'm not stupid. But my statement about the CPUs was not in reference to a machine, but to the CPUs themselves. My only statement about the P640RE was that it should theoretically be possible to get CPU cooling as good as mine in it, which is likely true considering heatsink design and fan size. The P370SM3's CPU heatsink is really weak, and there's only a single small fan for it.

Again, lowering standards is machine-agnostic. I want the HARDWARE to be worth its salt. Do I consider the 980M worth its salt like the 780M was? Not even close. The 965M likewise suffers from the same problem the 960 does: a severe lack of memory bandwidth. But its core is a lot weaker, and like the 860M/960M/750Ti, it's near impossible to run into a scenario where memory bandwidth bottlenecks the card in a game. The 970M is a fine card all things considered, but in the entire scheme of Maxwell's line it's a weak card, and the only extra mobile card they've made is the mobile 980. My only real complaint is that they didn't seem to be able to make the lower-TDP MXM-sized card fit in most of the older machines, but in reality it seems to be well done otherwise (for once)... despite its ridiculous price.

    It's not standards. I expect what I can get out of a laptop chassis. I don't expect people to make a laptop chassis and design it to throttle because of bad cooling. I don't care what marketing thinks. Put the hardware in the laptop that the chassis and cooling system was designed for. Dell XPS 13? No problems. Alienware 13? No problems. Clevo W230SS? No problems. P640RE? No problems. Aorus X7 Pro? Problems. Gigabyte P34W which can throttle a 970M? PROBLEMS. My "standards" statement is from the people who go "it's a laptop, I don't expect it to work perfectly; if you want it to work properly get a desktop". And as I said, and as you agree in the next paragraph I'll be replying to, that's part of the problem. Every time I mention full-BGA or throttling, TDP-locked hardware to someone outside my circle of friends who also has a decent PC, their instant response is "so? Just get a desktop if you don't want that". Again, tis part of the problem.

    A thin 14" laptop will never, as long as we live, provide the same capabilities as a same-generation 18" DTR. It's simple physics. If Pascal GPUs are so cool and TDP frugal or AMD's Arctic Islands live up to their name that we can shove a full-cored midrange GPU in a 14" laptop with Intel's next CPU architecture's flagship... cool. An 18" DTR can probably shove two of those GPUs in there and overclock the CPU higher. It's a simple matter of how much space is available. What I always say though, is this: if we design our best hardware to fit a certain low limit, we're wasting potential. If we can fit a GTX 980's power into a 75W TDP envelope with Pascal, that's cool... but if that's our mobile flagship, we're wasting power. Why not make a 100W mobile flagship using the same tech? How much extra power could we get out of that? Etc etc.
     
    Ramzay likes this.
  34. Ramzay

    Ramzay Notebook Connoisseur

    Reputations:
    476
    Messages:
    3,185
    Likes Received:
    1,065
    Trophy Points:
    231
    Ok, I get what you're saying now. Yes, that is indeed an unacceptable way of thinking.

    I had the same thing happen with my house (we bought a new-build townhouse in the suburbs, and the suburbs of Toronto are stupidly expensive). The floor guys did a piss-poor job installing the wooden floors, and they were crooked. We complained to the builder, who eventually had the flooring company tear out the floors and do it again. The comment the owner of the flooring company gave him was apparently along the lines of "it's a townhouse, who cares?" The fact that it's "just a townhouse" shouldn't hold him to lower standards than if it were a detached house.

    I guess a lot of people just don't expect much from laptops for some reason. Maybe they're still stuck in the world of a decade ago, where even the best/most powerful laptops couldn't hold a candle to a desktop. If you only put the components in a laptop that the chassis/cooling system can properly handle, it'll work just fine.

    I'm extremely picky when it comes to laptops (hence why I've gone through about 6-8 of them in the past year). I don't tolerate whiny loud fans, hot keyboard temps, bad keyboards, or flimsy lids, amongst other things. Why? To me those problems/issues indicate a poorly-designed laptop. While a throttling CPU doesn't really bother me from a performance point of view, it does irritate me in that if the laptop can't run the CPU without throttling, it was poorly designed.

    Stop making poorly-designed laptops.
     
  35. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,431
    Messages:
    58,189
    Likes Received:
    17,900
    Trophy Points:
    931
    With the introduction of the K series, the gap between laptop and desktop CPUs has thinned a fair bit. The anomaly was the initial Core series of CPUs (vs. the Pentium 4), as the architecture was superior.
     
    ghegde likes this.
  36. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    Exactly this. In the real world, when you do a poor job of design on anything, especially a machine, you pay for it. You either do it over or you lose your jobs. But in the laptop world, skimping on design seems to be par for the course.

    Yeah, they don't expect much, and they are stuck in the world of a decade ago. And it won't change until people properly review and bash laptops when they do things badly. It's not because we hate the manufacturer, but because they need to improve.

    Yeah, since using my D900F and my P370SM3, touching any other laptop is a literal pain in the behind when it's on. Their keyboards are so unbelievably hot. And the thing is, to others, machines like the W230SS are on the "exceptionally cool" level, and it's expected that the chassis is that level of uncomfortably warm otherwise. If your design is for X level of power, then put that inside it. And then if someone wants to say "well, I can't use that because there isn't enough power", then they're right. That's how it should be. If you need something, buy the form for the function you need. If the function you need is thinness and lightness over power, then buy that form... but don't expect both to magically work.

    Amen
     
  37. Ramzay

    Ramzay Notebook Connoisseur

    Reputations:
    476
    Messages:
    3,185
    Likes Received:
    1,065
    Trophy Points:
    231
    This is something I also don't get. Having grown accustomed to my Alienware 17 R1 and ASUS G751, I expect keyboards to be cool to the touch, even under load. That, to me, is how any laptop SHOULD be designed. Yet, when people say "cool to the touch", a lot of them seem to mean "it doesn't go above 40C". Are you kidding me? 40C IS hot to the touch. Anything above 35C is uncomfortable. Even in @HTWingNut's reviews, a hot spot is apparently a spot running above 40C, which boggles my mind (no disrespect to HT here, I appreciate his work, just pointing out our differing points of view on what constitutes "hot").

    I can somewhat tolerate slight heat build-up when gaming, but when a laptop's keyboard is hot even when idle, I have a problem. The new Alienware 17 machines are guilty of this, and honestly, so was my Clevo P750ZM (though that at least had the excuse of needing to cool a desktop 80W Xeon CPU).

    I'm concerned about the keyboard temps of the P750DM, though since Skylake apparently idles much cooler than Haswell, it shouldn't be so bad. But to me, a hot keyboard is a big deal-breaker.
     
    D2 Ultima likes this.
  38. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,431
    Messages:
    58,189
    Likes Received:
    17,900
    Trophy Points:
    931
    That's why where possible manufacturers do tend to borrow airflow from the fan to draw it through the keyboard (even though drawing it through the bottom would make temperatures on the chips slightly better). That does get a little difficult to do in this upper mid range size until you get to the size of the P570WM in which case the keyboard stayed at room temperature due to the amount of material between it and the heat sources ^-^.
     
    jaybee83 likes this.
  39. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    But even a W650ST I directed someone in my country to buy has been unbelievably cool to the touch. She brought it to me and I gave her the rundown on how to maintain it and what the good and bad practices for using it are, and it was as cool as my P370SM3's keyboard. My D900F's keyboard in fact got hotter than my P370SM3's, so Clevo has been improving on that the whole time. It's something people need to invest in.
     
  40. Ramzay

    Ramzay Notebook Connoisseur

    Reputations:
    476
    Messages:
    3,185
    Likes Received:
    1,065
    Trophy Points:
    231
    I guess some people aren't concerned/bothered by a hot keyboard? I don't personally understand how that could be, but it's the only explanation for how the current trend of warm/hot keyboards is tolerated by users.
     
  41. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    Because they don't know that it's not supposed to be like that. When they use something better and realize better exists for the same cash, they start raising their standards. But nobody tells them that it exists, so they don't care.
     
    DataShell and Ionising_Radiation like this.
  42. DataShell

    DataShell Notebook Deity

    Reputations:
    47
    Messages:
    777
    Likes Received:
    354
    Trophy Points:
    76
    Macbooks are awful.

    'They're enough for my needs! I don't need premium features. '

    They're way overpriced for what they are.

    'Well that's because they have all these premium features that I need!'
     
    Last edited: Oct 7, 2015
  43. Ionising_Radiation

    Ionising_Radiation ?v = ve*ln(m0/m1)

    Reputations:
    757
    Messages:
    3,242
    Likes Received:
    2,667
    Trophy Points:
    231
    Well said.

    That's Apple for you.

    Premium features? Like what? A shiny silvery design by the holiest design guru, Sir Jonathan Ive, running a wonderful operating system with a 'beautiful' design also by the aforementioned?

    OS X and iOS looked perfectly good before the git that is Jony Ive came in and ruined it by 'flattening' it. The last OS X that looked good was Mavericks; the last iOS that looked good was iOS 6.

    I downgraded an old iPhone 4S to iOS 6 and I cannot express the feeling of familiarity and relief I felt. iOSes 7+ look childish and unprofessional, and likewise for Yosemite and beyond.

    Jony Ive cannot do software UI design. Anyone who thinks otherwise probably loves Apple too much. Too bad Apple fired Forstall.
     
    D2 Ultima and CaerCadarn like this.
  44. moviemarketing

    moviemarketing Milk Drinker

    Reputations:
    1,036
    Messages:
    4,247
    Likes Received:
    881
    Trophy Points:
    181
    Personally I don't mind at all, as long as it helps improve the cooling process.

    If I'm running games or super demanding applications, I usually use an external wireless mouse, external keypad and external display.
     
  45. deepfreeze12

    deepfreeze12 Notebook Guru

    Reputations:
    0
    Messages:
    57
    Likes Received:
    61
    Trophy Points:
    26
    So, any information on when the new P640RE will be available? And would we be likely to see an i7-6820HK + 6GB 970M, or are we stuck with an i7-6700HQ + 3GB 970M? I hope this little beast packs quite the punch... :D
     
  46. Prema

    Prema Your Freedom, Your Choice

    Reputations:
    9,368
    Messages:
    6,297
    Likes Received:
    16,482
    Trophy Points:
    681
    You will be able to get it with the 6820HK, but only the 3GB 970M.
     
    TomJGX, jaybee83 and deepfreeze12 like this.
  47. deepfreeze12

    deepfreeze12 Notebook Guru

    Reputations:
    0
    Messages:
    57
    Likes Received:
    61
    Trophy Points:
    26
    Well, at least it's not all bad news. :D The i7-6820HK will be a very nice addition, I just hope the chassis can cool it properly when OCed. :) Also @Prema, the max supported RAM would be 32GB (2x16GB), right? Also, any information at all about that QHD display? :D
     
    TomJGX likes this.
  48. Prema

    Prema Your Freedom, Your Choice

    Reputations:
    9,368
    Messages:
    6,297
    Likes Received:
    16,482
    Trophy Points:
    681
    32GB, right...no idea about the screen.
     
  49. Ramzay

    Ramzay Notebook Connoisseur

    Reputations:
    476
    Messages:
    3,185
    Likes Received:
    1,065
    Trophy Points:
    231
    Sounds like a desktop at that point.

    For those who use their laptop (aka the keyboard) a blistering hot surface temp is unacceptable.
     
  50. jaybee83

    jaybee83 Biotech-Doc

    Reputations:
    4,125
    Messages:
    11,571
    Likes Received:
    9,149
    Trophy Points:
    931
    well, i wouldnt call "warm" = "blistering hot". dont be such a wuss! :D or cant your soft girly hands take THE HEAT?! *teases* ;)

    Sent from my Nexus 5 using Tapatalk
     