The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information posted on the forums is preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.

    NEW!! Alienware Area-51M LAPTOP!! (to replace alienware 15 and 17)

    Discussion in '2015+ Alienware 13 / 15 / 17' started by QUICKSORT, Jan 7, 2019.

  1. yrekabakery

    yrekabakery Notebook Virtuoso

    Reputations:
    1,470
    Messages:
    3,438
    Likes Received:
    3,688
    Trophy Points:
    331
    Is 1.07V a typical stable undervolt on an i9-9900K at stock clocks?
     
    Vistar Shook likes this.
  2. Rei Fukai

    Rei Fukai Notebook Deity

    Reputations:
    1,048
    Messages:
    983
    Likes Received:
    1,301
    Trophy Points:
    156
    Well, I don't want to accuse anyone of anything, but @Mr. Fox has a pretty substantial track record of going back and forth with AW. It's probably because of his hard work and dedication, along with @Prema, that AW was the brand that they were. Because to be honest, even though they made great machines, they're now the brand that you're warning people away from. And that's understandable, because you had a bad experience with Clevo. But the same thing also applies to @Mr. Fox and other enthusiasts.

    @iunlock has also done some great work regarding tuning and mods, but also communication with AW. But the fact is, if AW was the brand you hold them to be, there shouldn't have needed to be enthusiasts in contact with the company to "fix" their mistakes.

    I'm also a fan, in love with my R5 (now that it's tuned properly) and an AW fanboi. But that has changed, because AW has shown their true colors. Nonetheless I hope this is a turnaround, because it's a light! And light = hope!!

    But a dose of scepticism can never do harm, especially when the last 5 years haven't been without problems.

    DuZYgIIXQAUAcxE.jpeg
     
    Cass-Olé, Ashtrix, Aroc and 7 others like this.
  3. Johnksss

    Johnksss .

    Reputations:
    11,531
    Messages:
    19,452
    Likes Received:
    12,817
    Trophy Points:
    931
    Waiting to see how many come back with a recommendation like this one. :D
     
  4. Reciever

    Reciever D! For Dragon!

    Reputations:
    1,520
    Messages:
    5,336
    Likes Received:
    4,280
    Trophy Points:
    431
    Where can I get those shades man? They filter out everything I don't wish to see!
     
    Darkhan likes this.
  5. katalin_2003

    katalin_2003 NBR Spectre Super Moderator

    Reputations:
    14,958
    Messages:
    5,671
    Likes Received:
    1,515
    Trophy Points:
    331
    Coming up.
    Better now?

    Guys, I wasn’t kidding earlier.
     
  6. alaskajoel

    alaskajoel Notebook Deity

    Reputations:
    1,088
    Messages:
    1,031
    Likes Received:
    964
    Trophy Points:
    131
    My sample size is not large enough for me to say if 1.07V is normal from personal experience. Other enthusiasts online seem to agree with Mr. Fox that 1.07V is on the low side of average. Anandtech's review shows they needed 1.075V for 4.7GHz. I presume they tried lower voltages, since they got to 4.5GHz with just 1.025V.

    What I can say is that all eleven of the 9900K CPUs I've tried personally are able to sustain a -100mV adaptive undervolt using ThrottleStop. Again, I don't know if this is representative of all 9900K CPUs or just the batch my local Micro Center is selling from. I really don't think there is any doubt a 9700K will stay below 136W sustained, even at stock voltage. You might even have some overclocking headroom.

    Assuming ThrottleStop is reliable, the % difference in power consumption between my stock and undervolted Cinebench runs is almost exactly the same as what Anandtech saw: about a 25% decrease in total power consumption with a -100mV undervolt, and the scores are certainly in the range of other 9900K chips at stock clocks. Pics are attached below of the runs I just made. This is with an Asus Prime Z370-A motherboard, Corsair Vengeance LPX RAM (32GB, 2666MHz) and a Noctua D15 with a single fan.
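As a rough sanity check on that ~25% figure: dynamic CPU power scales with roughly the square of the voltage at a fixed clock (P ≈ C·V²·f), which is why a small undervolt buys an outsized power saving. A minimal sketch, with a hypothetical stock load voltage chosen only for illustration:

```python
# Dynamic-power model P ≈ C · V^2 · f: at a fixed clock, power scales with V^2.
# The stock voltage below is hypothetical, chosen only to illustrate the effect
# of a -100mV offset; leakage (static) power is ignored.

def power_reduction_pct(v_stock: float, v_undervolt: float) -> float:
    """Estimated % drop in dynamic power at an unchanged clock speed."""
    return (1.0 - (v_undervolt / v_stock) ** 2) * 100.0

v_stock = 1.17          # hypothetical stock load voltage
v_uv = v_stock - 0.100  # after a -100mV undervolt

print(f"~{power_reduction_pct(v_stock, v_uv):.0f}% lower dynamic power")
```

Leakage also falls with voltage and temperature, and a cooler chip throttles less, so measured package-power savings can exceed this dynamic-only estimate.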


    9900k, stock voltage
    [​IMG]

    9900k, -100mv undervolt
    [​IMG]

    Obligatory desktop pic:
    [​IMG]

    A couple comments while I'm thinking about them...
    1. I totally understand the angst felt when you buy something (9900K) and it doesn't do what you expect it to given its specifications. If Alienware is situationally limiting the CPU to 119W, 136W or something else, I absolutely find it disingenuous of Dell to sell the 9900K in completely stock form without telling the customer about it, knowing it will underperform compared to what a reasonable person would otherwise expect from the product (stock voltage, 4.7GHz on 16 threads). I don't think it's reasonable to expect everyone purchasing from a mainstream brand will know how to undervolt their CPUs to get the performance they expect. This is the gamble our first adopters are taking and I thank them for it. With that said, I hope the first adopters are also going into this purchase knowing they may need to fiddle with an undervolt to have the performance experience they expect AND they should be okay taking on the risk of a slightly lower performing CPU, since we don't have all the answers. These screenshots of my current CPU do not mean yours will perform exactly the same way in the 51m with different memory, motherboard and firmware that will almost certainly be artificially limited for reasons we don't completely understand. Some of these problems we might ultimately find solutions for and others we might be stuck with.

      I completely agree there is a principled argument for saying "I'm not getting what I paid for! Shame on Dell!" if this turns out to be true, but that's also why this community exists...to fact check the stuff manufacturers say and to give you the information needed to make an informed decision. I have never met @Ultra Male in real life, but I love this guy because he and others like him are often the first to purchase these new crazy machines and share so much valuable information with us. The last thing we should be doing is harshly criticizing the decisions of our first adopters with hurtful words when they are ultimately going to help answer our questions... if they don't get scared off first.

    2. Part of the problem with reviewing a CPU's performance is the results are often presented on a continuous scale (like these Cinebench scores for example), but our satisfaction with a device is typically a more binary experience. "Does the CPU let me perform the work tasks I need to do in a satisfactory way?"..."Can I get 60fps in PUBG and stream at 720p60 simultaneously without dropping frames?"..."Can I play a game while watching Netflix on a different screen?", etc.

      The difference between a 9900k with 8 cores at 4.5 or 4.6 compared to the stock 4.7 might not be substantively impactful to what you want or need from your laptop in the slightest. If this is the case for you, don't let the melodramatic comments of some folks get you worked up. Stop worrying about it. Enjoy your laptop and don't obsess over a 1950 Cinebench score vs a 2050 score. I can't stress how little of a difference this is. Even if the CPU has to downclock to 4.4 on 16 threads to maintain a 136w limit, this is still a crazy device and with 8 cores, it is much stronger than most other 17" laptops. If you don't like the compromises other 8-core machines from MSI, ASUS or Clevo make in their desktop replacements, this might still be what fits you the best. If however, you are the person who really enjoys benchmarking and you find it fun to get the highest absolute scores from your device by overclocking, this might not be your dream machine...and that is also perfectly okay.

    I am not a crazy (meant in a good way!), overclocking, benchmarking enthusiast, although I completely respect those who enjoy doing so. This just means when I read comments from this type of NBR member, I still take in their contribution, but I might not weigh that member's perspective as heavily as that of another member with whom I more closely identify regarding workflow, computing compromises and performance priorities. :)
     
    Tony Yang, Ashtrix, Aroc and 13 others like this.
  7. yrekabakery

    yrekabakery Notebook Virtuoso

    Reputations:
    1,470
    Messages:
    3,438
    Likes Received:
    3,688
    Trophy Points:
    331
    Thanks for the info. I was very surprised at the low voltage. If around 1.07V is stable for the average 9900K at 4.7GHz on 8C/16T, that's an amazing improvement in the voltage/frequency curve compared to 8th gen Coffee Lake. I might've even believed you if you told me 9th gen uses 14nm+++ or 10nm :). Then again, I've also heard that the 9900K is the creme de la creme in terms of binning, and lower tier chips like the 9700K and 9600K do not generally have such optimal V/F scaling. Anecdotally this seems to be reinforced by @GizmoSlip 's recent review of the Eurocom X4C, where he mentioned no change in power/heat output and temperatures between the 9700K and 9900K in that system after undervolting both.
     
    Ashtrix, Latostno, alaskajoel and 2 others like this.
  8. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist

    Reputations:
    37,213
    Messages:
    39,333
    Likes Received:
    70,624
    Trophy Points:
    931
    Good info. Thanks for sharing it. Should be very useful for folks that buy this to work out some stock tuning optimization. :vbthumbsup:

    Nice looking desktop as well.

    So, to go along with the previous post, here are 9900K stock laptop wPrime 32M and 1025M results from @Prema that can be used as yet another measuring stick. One should note the difference in default voltage versus the undervolt that @alaskajoel showed us to be possible on his desktop. That may or may not be possible with the same success on every platform, and with varying CPU bin quality, but it's something that can be tested and adjusted to find the lowest possible stable undervolt on any given machine as a starting point. Trial and error is how it is done... no rocket science.

    wPrime-9900K-Stock.jpg

    And, for the sake of reader convenience, here are the other two Cinebench examples previously posted.

    9900K_Prema.jpg 9900K_Stock_P870.jpg
     
    Last edited by a moderator: Jan 13, 2019
    Cass-Olé, Aroc, raz8020 and 7 others like this.
  9. iron_megalith

    iron_megalith Notebook Geek

    Reputations:
    5
    Messages:
    82
    Likes Received:
    27
    Trophy Points:
    26
    Exactly why I don't really mind if this machine doesn't have that much headroom for pushing benchmarks. But it will be great to know how a 9900k will perform vs a 9700k setup.

    For those who like those super bulky devices, more power to them. However, I have to find the center between power and portability. I travel a lot and need a device that's able to handle as much of my work load as it can. I really like a desktop but I feel that I am not getting what I paid for since I'm out of my home most of the time due to work.

    In all honesty, the 17R5 was spot on. Not too heavy and not too thick. Pretty decent performance, but it could have been more if not for the bad thermals. At one point, I really wanted to keep this device. If the Area-51m can perform properly on thermals after a proper repaste (bonus points if Dell improved their thermal paste) and has a little bit more room to wiggle, I'm fine with that. I don't really care if it doesn't match the full potential of a desktop. If someone recommends I build a kickass ITX setup instead, I will kindly show them the exit.

    As someone said before, different people have different needs. A lot of you guys are clamoring about the headroom for pushing benchmarks, but I'm here getting irritated just by looking at the other Type-C port that became a Type-A. That is something I want to ask Frank or someone from Dellienware: WHY? I really needed more Type-C ports, not fewer. You can easily get a dongle to address a missing Type-A port, but not vice versa.
     
    Last edited: Jan 13, 2019
  10. Spartan@HIDevolution

    Spartan@HIDevolution Company Representative

    Reputations:
    39,567
    Messages:
    23,559
    Likes Received:
    36,825
    Trophy Points:
    931
    Well, it has 3 USB ports and 1 Type-A, right? That should be enough if you get one of those Type-A to Type-C adapters. I have one and it gives me 4 additional USB 3.0 ports + a LAN port (which I don't need, obviously, but still).

    On a side note, if you get the Alienware 17R5 from HIDevolution, they have sorted out the temps and tamed them using their proprietary thermal mods. The good thing about getting a laptop from them is that no matter which laptop you choose, they will certainly improve on whatever stock thermals it came with, be it with a different thermal paste or thermal mods (i.e. Fujipoly Extreme thermal pads, bottom panel mods, etc.)

    For example, this is the bottom panel mod which they did on my MSI GT75 Titan, which reduces temps by 5°C:

    [​IMG]
     
    Aroc, Rei Fukai and jclausius like this.
  11. alaskajoel

    alaskajoel Notebook Deity

    Reputations:
    1,088
    Messages:
    1,031
    Likes Received:
    964
    Trophy Points:
    131
    Good question... although, a fellow 15R4 owner and I were chatting the other week and he was pretty upset that his side USB-C port "didn't work." Digging a little deeper, he had tried to use it as a video output with a USB-C to HDMI adapter, and the side USB-C port doesn't carry video... only the rear Type-C port carries video (albeit only from the iGPU) and TB3.

    After clearing that up with him, I realized it's not so crazy for someone to think it would be broken had they not read the manual. The Type-C standard is a convoluted mess, so maybe Dell decided to avoid the explanation entirely in the future and only include one "do it all" Type-C port?
     
  12. Reciever

    Reciever D! For Dragon!

    Reputations:
    1,520
    Messages:
    5,336
    Likes Received:
    4,280
    Trophy Points:
    431
    Well, it could work fine on a DisplayLink-type panel, or at least it should.
     
    Darkhan and alaskajoel like this.
  13. Johnksss

    Johnksss .

    Reputations:
    11,531
    Messages:
    19,452
    Likes Received:
    12,817
    Trophy Points:
    931
    So here is a test run as well.
    9900k-test.PNG
    9900k-test-1.PNG
     
    Last edited: Jan 13, 2019
    iunlock, Ashtrix, Aroc and 7 others like this.
  14. Spartan@HIDevolution

    Spartan@HIDevolution Company Representative

    Reputations:
    39,567
    Messages:
    23,559
    Likes Received:
    36,825
    Trophy Points:
    931
  15. Johnksss

    Johnksss .

    Reputations:
    11,531
    Messages:
    19,452
    Likes Received:
    12,817
    Trophy Points:
    931
    Now to also add real-time picture grabs at the highest and lowest points of the infamous stability testing. As we all know, you have to have the first 3 boxes checked for it to be valid. Which they are.

    Also to add: no AC. No mods. Fully assembled. Max fans. ICDiamond.
    9900k-test-2.PNG 9900k-test-3.PNG 9900k-test-4.PNG
     
    Cass-Olé, Aroc, raz8020 and 2 others like this.
  16. ssj92

    ssj92 Neutron Star

    Reputations:
    2,446
    Messages:
    4,446
    Likes Received:
    5,690
    Trophy Points:
    581
    So basically, even in a P870TM1, the i9-9900K thermal throttles at stock clocks?
     
    c69k, raz8020 and Vistar Shook like this.
  17. Johnksss

    Johnksss .

    Reputations:
    11,531
    Messages:
    19,452
    Likes Received:
    12,817
    Trophy Points:
    931
    Sorry bud, but they all throttle when running at 100 percent load over a long period of time. The point being made is: which makers can stay above 4.0GHz under full load? And that is a thermal throttle, not the power throttle that most seem to be talking about.
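The thermal-vs-power distinction can be read directly off the CPU on Intel parts. A sketch of decoding MSR_CORE_PERF_LIMIT_REASONS (0x64F) on Linux; the bit positions follow Intel's documented client-CPU layout, but treat them as an assumption and verify against the SDM for your exact part (ThrottleStop's "Limit Reasons" panel shows the same data on Windows):

```python
# Sketch: separating thermal throttling from power-limit (PL1/PL2) throttling
# on Intel CPUs by decoding MSR_CORE_PERF_LIMIT_REASONS (0x64F). Bit positions
# below are per Intel's documented client-CPU layout -- an assumption here;
# check the SDM for your exact part.

THROTTLE_BITS = {
    0: "PROCHOT",
    1: "Thermal",
    10: "Package PL1",
    11: "Package PL2",
    12: "Max turbo limit",
}

def decode_limit_reasons(msr_value: int) -> list[str]:
    """List the active limit reasons encoded in an MSR 0x64F value."""
    return [name for bit, name in THROTTLE_BITS.items() if msr_value & (1 << bit)]

def read_msr(cpu: int = 0, reg: int = 0x64F) -> int:
    """Read an MSR on Linux (requires root and the 'msr' kernel module)."""
    with open(f"/dev/cpu/{cpu}/msr", "rb") as f:
        f.seek(reg)
        return int.from_bytes(f.read(8), "little")

# Synthetic example: thermal and PL1 bits set simultaneously.
print(decode_limit_reasons((1 << 1) | (1 << 10)))  # ['Thermal', 'Package PL1']
```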

    Here is a 100% real world test.
    9900k-test-5.PNG 9900k-test-6.PNG 9900k-test-7.PNG

    This is what everyone is up against. And that is where the bar is set.

    Thanks, Prema's score is better though. :)
     
    Last edited: Jan 13, 2019
    Rage Set, Cass-Olé, ssj92 and 7 others like this.
  18. jclausius

    jclausius Notebook Virtuoso

    Reputations:
    6,160
    Messages:
    3,265
    Likes Received:
    2,573
    Trophy Points:
    231
    My favorite line from this video regarding laptops becoming obsolete, "Because you've never been able to swap out the graphics card... until now." First time I saw it, I did a spit take. And even now, I still get a big guffaw from that!

    Second favorite line in regards to going with an LGA based laptop in the Area 51m, "Why is Alienware finally doing this now?... Alienware was growing frustrated with compromises with ever thinning laptops, it decided to do something 'thicker' for once." :D

     
    Last edited: Jan 13, 2019
    Ashtrix, Aroc, raz8020 and 3 others like this.
  19. alaskajoel

    alaskajoel Notebook Deity

    Reputations:
    1,088
    Messages:
    1,031
    Likes Received:
    964
    Trophy Points:
    131
    For what it's worth, I have an MSI WT-75 with a modified BIOS to allow the 9900K, and it doesn't thermal throttle with an undervolt at stock clocks. This is with a -100mV undervolt at stock clocks using an IC Graphite pad. Under a P95 load it tops out at 93°C with max fans.

    The power draw readings are incorrect in ThrottleStop because the motherboard has a hard power limit at 95W, which we can only get around by modifying some IMON settings in the BIOS. This is with IMON offset at 31000 and IMON slope at 50... so if my math is correct, the max power value is really about 138W. Without the undervolt, it thermal throttled down to 4.2GHz, and without an undervolt on stock IMON settings it PL throttled to 95W and 3.6GHz.

    @win32asmguy and I have both tested this setup on the WT-75

    [​IMG]
     
  20. Johnksss

    Johnksss .

    Reputations:
    11,531
    Messages:
    19,452
    Likes Received:
    12,817
    Trophy Points:
    931
    Maybe I'm missing something... so let me ask this: do you see me thermal throttling running Cinebench R15? My answer would have to be no.

    So, as CR15 was used as an example, let's also use a real-world example: HandBrake encoding a 4K video, or a non-real-world AIDA64 stress test, with as much info in the screenshots as possible.
    Yeah, I don't see any throttling in there...

    9900k-test-1.PNG
     
    Last edited: Jan 13, 2019
    Cass-Olé, Ashtrix, ssj92 and 6 others like this.
  21. jclausius

    jclausius Notebook Virtuoso

    Reputations:
    6,160
    Messages:
    3,265
    Likes Received:
    2,573
    Trophy Points:
    231
    This feature has not been lost on me either! A great idea on keeping one's video as current as one wants.
     
    Aroc, raz8020 and Vistar Shook like this.
  22. Johnksss

    Johnksss .

    Reputations:
    11,531
    Messages:
    19,452
    Likes Received:
    12,817
    Trophy Points:
    931
    Yeah, it had me in a WTH moment. And the other stuff seems to be the norm when people haven't used high end laptops for a long time.
     
    Mr. Fox, Aroc, raz8020 and 2 others like this.
  23. Lunatik

    Lunatik Notebook Evangelist

    Reputations:
    83
    Messages:
    459
    Likes Received:
    267
    Trophy Points:
    76
    No different than literally every company saying they have “THE THINNEST GAMING LAPTOP EVER!” or “THE FIRST LAPTOP WITH A 2080 IN THE WORLD” at CES, lmao. I laughed every time someone said either of those statements.
     
    Aroc, jclausius, raz8020 and 5 others like this.
  24. Johnksss

    Johnksss .

    Reputations:
    11,531
    Messages:
    19,452
    Likes Received:
    12,817
    Trophy Points:
    931
    I have to agree 100%!
     
  25. iron_megalith

    iron_megalith Notebook Geek

    Reputations:
    5
    Messages:
    82
    Likes Received:
    27
    Trophy Points:
    26
    I could see that happening. I mean, that was an issue on the Wacom Cintiq Pro 16 as well, since not all ports are for display. I think a proper label is needed for those kinds of scenarios. It doesn't really need to be changed to Type-A.

    While connecting over Type-A is still fast, I can definitely see that connecting over USB Type-C is faster, both in CrystalDiskMark tests and real-world tests. I'm using a 2TB Samsung Portable SSD T5. The drive is formatted to exFAT due to NTFS compatibility issues with Android. Whenever I'm doing a transfer of a big video file (20GB), I get around 320MB/s for Type-A and 356MB/s on USB Type-C. Regardless, though, it just boils down to preferences and convenience. Ever since I used USB-C, I never want to use any other USB type ever again.
     
  26. Homer S

    Homer S Notebook Evangelist

    Reputations:
    11
    Messages:
    314
    Likes Received:
    86
    Trophy Points:
    41
    What I expect is that my chosen GPU(s) won't cause the PSU to brown out because the PSU doesn't provide enough juice. If that is because the lappy and/or GPU makers got together to protect me from myself, boo!!!

    Homer

    He's hard on them because they have already done it that way. If I'm spending $4,000+ on a laptop, I'd better get the performance to run games where I want them, now and a few years down the road. The M18xR2 almost carried it off, if you were willing to strap two PSUs together. Everything since then couldn't do it without herculean efforts, and BIOSGuard stops the unlocking. If Dell sends these out the door locked... Clevo, here I come. I'm already sad I'll have to give up 18" soon... when Win 7 goes patch-less it will be the wild west and open season!

    Homer
     
    Last edited by a moderator: Jan 15, 2019
  27. NuclearLizard

    NuclearLizard Notebook Deity

    Reputations:
    162
    Messages:
    939
    Likes Received:
    728
    Trophy Points:
    106
    If it works, I don't care how it looks. Hell, slap an x700K and an xx80-class GPU or its equivalent in an old IBM ThinkPad-style chassis and I'm happy, as long as I can get my performance out of it.

    That's my dream laptop, actually. A nice big thick military/business-style laptop that has performance and upgrades for days. I might settle for some subtle RGB as well. Lol

    Sent from my LM-Q710.FGN using Tapatalk
     
    Aroc and jclausius like this.
  28. NuclearLizard

    NuclearLizard Notebook Deity

    Reputations:
    162
    Messages:
    939
    Likes Received:
    728
    Trophy Points:
    106
    That would do it. That's what pushed me to my MSI GT83. At the time I wanted something that I could expect support for and, if it didn't do what I wanted, return to my local store.

    Now I think they've got a winner in that WT75, maybe this Alienware, and I might look at Clevo again.



    Sent from my LM-Q710.FGN using Tapatalk
     
    Spartan@HIDevolution likes this.
  29. Falkentyne

    Falkentyne Notebook Prophet

    Reputations:
    8,396
    Messages:
    5,992
    Likes Received:
    8,633
    Trophy Points:
    681
    I hope that's a negative -31000 Imon offset and not positive :)
    I have taught you well, young Jedi.
     
  30. Falkentyne

    Falkentyne Notebook Prophet

    Reputations:
    8,396
    Messages:
    5,992
    Likes Received:
    8,633
    Trophy Points:
    681
    The WT75 (I'm also assuming the MSI F7 will use this design with 6 CPU heatpipes, without the GPU VRM puking on a 3-heatpipe solution like we saw on the GT75 Titan + 1080/1070 systems; the RTX version of the BGA MSI GT75 Titan also uses the new CPU heatpipe design rather than the aborted version used in the GTX MSI GT75 @Papusan @Mr. Fox ) would actually be a half-decent laptop if it weren't held back by EC-limited BGA cancer code. The EC doesn't detect the 9900K CPU as a valid SKU, so the BIOS power limit overrides (the BIOS can go up to a 500W power limit 1) get ignored by the EC, and it enforces a 95W TDP. The 16L13 had this problem (brother @Prema removed the EC TDP issue, but then the laptop would just shut off if both CPU and GPU were loaded simultaneously); now the WT75, RTX GT75 and F7 will also have this problem.

    Why didn't this happen on the GT75VR and GT73VR (e.g. 230W TDP GTX 1070 + 100W+ CPU)? Because those systems had to allow an SLI configuration (at least 230W through 2x MXM), so 230W through 1x MXM works without the laptop shutting off.

    The only redeeming quality of the MSI F7 / WT75 and the BGA RTX cancer versions is that unlocking the BIOS still makes it possible to trick the EC by spoofing the CPU power draw reporting (IMON Slope = 50, IMON Offset = -31999).
     
    Aroc, Mr. Fox, Ashtrix and 3 others like this.
  31. NuclearLizard

    NuclearLizard Notebook Deity

    Reputations:
    162
    Messages:
    939
    Likes Received:
    728
    Trophy Points:
    106
    OK, I get what you are saying. Now I gotta ask: if I am going to be running an x700K-series CPU, is that 95W TDP going to cause me issues at stock or with a slight overclock? Because a 9900K isn't gonna find its way into my machines; I have no idea what it offers above the 9700K other than hyperthreading. IMO I would rather play with the Ryzen 3000 series by the look of it anyway, but I digress.

    On the GPU, I get you: why can't we be offered the full-fat version like on the desktop? Hell, most of us don't even use SLI anymore, so I figure if we can get decent AIO-level cooling in it, why not?

    Sent from my LM-Q710.FGN using Tapatalk
     
  32. yggdra.omega

    yggdra.omega Notebook Geek

    Reputations:
    2
    Messages:
    82
    Likes Received:
    45
    Trophy Points:
    26
    How can you tell if the BIOS is locked/unlocked, or if there are other preventative measures that get in the way of overclocking this thing how it should be?
     
  33. jclausius

    jclausius Notebook Virtuoso

    Reputations:
    6,160
    Messages:
    3,265
    Likes Received:
    2,573
    Trophy Points:
    231
    If you can find anything in the model specs that says it ships with 'BootGuard', you will not be able to flash BIOS updates for the machine outside of official Dell/AW-released BIOS/UEFI. So that is one measure taken to keep the machine running only the UEFI that Dell/AW intends.

     
    Last edited: Jan 15, 2019
    raz8020 likes this.
  34. Vasudev

    Vasudev Notebook Nobel Laureate

    Reputations:
    12,035
    Messages:
    11,277
    Likes Received:
    8,814
    Trophy Points:
    931
    Yeah, from 4th gen onwards, bypassing Intel Boot Guard is difficult.
     
    jclausius likes this.
  35. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist

    Reputations:
    37,213
    Messages:
    39,333
    Likes Received:
    70,624
    Trophy Points:
    931
    And, that is terribly unfortunate. Micro$lop is the driving force behind this, and some of the OEMs are willing to be muppets and drink their Kool-Aid. It is nobody's business what an end user does with their own private property. If one chooses to run Legacy Mode and use MBR instead of UEFI with GPT (with or without Secure Boot) that should be available to them with no interference whatsoever. The OEMs and Micro$lop should not have any voice in this at all, but they've taken that to the extreme by totally blocking access to configuration options. Using force by eliminating configuration options is unethical. Those that want to flash modified firmware should be able to do so without Nazi-grade interference as well. This is more about freedom and configuration options than it is about overclocking. I run modded firmware on my router, too. I own it and it should be my decision, not theirs.
     
    Awhispersecho, Aroc, ssj92 and 6 others like this.
  36. iron_megalith

    iron_megalith Notebook Geek

    Reputations:
    5
    Messages:
    82
    Likes Received:
    27
    Trophy Points:
    26
    I tried 4K on the 17R5, and I'm not really impressed with it; so much so that I needed to scale the UI to around 200-250% for it to be readable. I'm really hoping that they release a 1440p IPS panel (120Hz, please), since I feel that it is a much more reasonable choice. However, I'd like to hear your opinions.

    Is 4K really a waste for 17-inch laptops? As stated earlier, I need the machine for productivity.

    I wasn't able to test my device for video editing too much, as I didn't want to transfer my license needlessly to a device that I wasn't sure about keeping. Tried it on gaming though, and it's meh, simply because you're moving too much to really appreciate it.
     
  37. NuclearLizard

    NuclearLizard Notebook Deity

    Reputations:
    162
    Messages:
    939
    Likes Received:
    728
    Trophy Points:
    106
    4K is a waste for anything under 32 inches, IMO. You would be much better served by a fast 1440p screen.

    Sent from my LM-Q710.FGN using Tapatalk
     
    TBoneSan, Mr. Fox and iron_megalith like this.
  38. iron_megalith

    iron_megalith Notebook Geek

    Reputations:
    5
    Messages:
    82
    Likes Received:
    27
    Trophy Points:
    26
    I wish this option were already available, but I digress.

    So it really doesn't benefit something like 4K video editing at all?
     
  39. NuclearLizard

    NuclearLizard Notebook Deity

    Reputations:
    162
    Messages:
    939
    Likes Received:
    728
    Trophy Points:
    106
    Maybe for video editing, but that's outside my area. You would most likely see an increase in sharpness, but I doubt it; I have "fairly good" vision, as it goes, and it's not very noticeable to me.

    https://www.sven.de/dpi/

    I used this, and with a 1440p 17.5" screen you get roughly the same PPI as you do on a 4K 27".
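The comparison is easy to verify by hand: PPI is just the diagonal pixel count divided by the diagonal size in inches. A quick check with the two panel sizes mentioned:

```python
# Pixels-per-inch from a panel's resolution and diagonal size,
# reproducing the sven.de/dpi comparison from the post.
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixel density: diagonal pixel count divided by diagonal inches."""
    return math.hypot(width_px, height_px) / diagonal_in

print(f'1440p @ 17.5": {ppi(2560, 1440, 17.5):.0f} PPI')  # ~168 PPI
print(f'4K    @ 27":   {ppi(3840, 2160, 27):.0f} PPI')    # ~163 PPI
```

The two densities land within a few PPI of each other, matching the comparison above.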

    Sent from my LM-Q710.FGN using Tapatalk
     
  40. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist

    Reputations:
    37,213
    Messages:
    39,333
    Likes Received:
    70,624
    Trophy Points:
    931
    Yeah, I totally agree with this.
    For the reason you noted, 4K on a small display is pretty dog-gone miserable for most folks. If you up the scaling to make text pleasant to read, you lose any benefit gained in terms of usable screen real estate. I loathe 4K on anything other than a very large display. I only enjoy 100% scaling. Unless you are really into professional photography and advanced image editing, there is no practical benefit to it. Gaming at 4K with high graphics settings is also pretty crappy unless you have an abnormally powerful desktop. There are not any notebooks (yet) that have the horsepower needed to do it gracefully with high graphics settings.
     
    Vasudev, Ashtrix, Papusan and 3 others like this.
  41. NuclearLizard

    NuclearLizard Notebook Deity

    Reputations:
    162
    Messages:
    939
    Likes Received:
    728
    Trophy Points:
    106
    Yea, I don't get the rush to 4K. My GPU runs quite happily at 1440UW @ 95Hz on high or ultra, and on the 1080p internal it barely makes a whisper.

    If you grab a 4K, make it a big one. I got a 4K TV for my computer to plug into and it looks as sharp as my dock display.

    Sent from my LM-Q710.FGN using Tapatalk
     
    Papusan, TBoneSan and Mr. Fox like this.
  42. iron_megalith

    iron_megalith Notebook Geek

    Reputations:
    5
    Messages:
    82
    Likes Received:
    27
    Trophy Points:
    26
    Various people seem to say that 4K has better colors compared to other panels. I'm trying to find the reason why that would be the case. If that is the case, it really should be a no-brainer to get that option for my use case. I'm not really doing professional photography, but I will handle video editing and digital painting, both of which need the best color accuracy a monitor can provide.

    So far I haven't seen an in-depth comparison vs FHD/QHD/UHD in this regard, which is a little frustrating.
     
    Vasudev and Mr. Fox like this.
  43. Donald@Paladin44

    Donald@Paladin44 Retired

    Reputations:
    13,989
    Messages:
    9,257
    Likes Received:
    5,842
    Trophy Points:
    681
    The best way is to Google the screen model number to find the specs. It is difficult to 'compare' screens unless you have them side by side.
     
    Aroc, raz8020, Vasudev and 3 others like this.
  44. NuclearLizard

    NuclearLizard Notebook Deity

    Reputations:
    162
    Messages:
    939
    Likes Received:
    728
    Trophy Points:
    106
    I wouldn't be surprised if a lot of 4K panels have better color, as a lot of them were originally designed for content creation and derived from there.

     
  45. cookinwitdiesel

    cookinwitdiesel Retired Bencher

    Reputations:
    4,365
    Messages:
    11,264
    Likes Received:
    263
    Trophy Points:
    501
    Likely it's because prior 1440p screens were TN, whereas most 4K panels have been IPS.

    Sent from my Pixel 3 XL using Tapatalk
     
    Papusan, iron_megalith and Mr. Fox like this.
  46. iron_megalith

    iron_megalith Notebook Geek

    Reputations:
    5
    Messages:
    82
    Likes Received:
    27
    Trophy Points:
    26
    Good idea. However, these things are typically obscured when you're purchasing the units, unless you have a sales agent dig them up for you.

    If it was because of TN panels, it makes you wonder whether color accuracy on 1080p IPS panels is on par with 4K IPS panels.
     
  47. Donald@Paladin44

    Donald@Paladin44 Retired

    Reputations:
    13,989
    Messages:
    9,257
    Likes Received:
    5,842
    Trophy Points:
    681
    Until some have shipped and users or reviewers have them, that is probably the only way to get them.

    Even for older models/screens, you won't find the model numbers published by the manufacturers. Typically, forums with members who actually have them, reviews, or your friendly, knowledgeable sales agent are the only places you will find screen model numbers.
     
  48. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist

    Reputations:
    37,213
    Messages:
    39,333
    Likes Received:
    70,624
    Trophy Points:
    931
    There is also a subjective element to it. What looks nice to some eyes does not look any better or different to others. IPS is not always sunshine and roses, especially a low cost IPS panel. IPS is getting much better now that it is becoming more mainstream, but nothing is, or ever will be, perfect. If you cannot see anything better or beneficial to it, paying extra for it doesn't make much sense unless you plan to sell it before it becomes obsolete. If it is more popular, then there is a chance that it might be easier to sell later.

    Plus, 4K is not universally popular, and may not be for a long time (if ever) due to scaling issues and poor text readability on small screens. It may never be viewed by most as being a great option on small screens. And, that means it may never become common because display panel manufacturers are not likely to invest in products that are not popular. To some extent this is true for 1440p screens as well. The majority of the laptops sold are low cost trashbooks, and expensive screens that add to their cost make no sense. It is difficult to find a low cost consumer laptop even with 1080p at this point. Most are still pathetic 1366x768 garbage.
     
    Darkhan and Papusan like this.
  49. iunlock

    iunlock 7980XE @ 5.4GHz

    Reputations:
    2,035
    Messages:
    4,533
    Likes Received:
    6,441
    Trophy Points:
    581
    The same story and exact scenario exists in the phone arena, where the bootloader is locked down, preventing users from unlocking/rooting their devices to have the freedom to do what they want. I totally agree that's completely unethical.

    I agree. 1440p is the sweet spot in so many ways, and it has been my go-to resolution for everything.

    Yup, it's all about PPI. Having used my MBP for years, I've been spoiled by not seeing the dot matrix on the screen lol. This is what led me to become accustomed to 1440p, as it was the closest to Apple's odd 2880x1800 resolution on the 15-inch MBP. What baffles me is that the ultrawide screens that came out still had only 1080p of vertical res... it was dot matrix and looked like 720p.
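    The PPI comparison above can be checked directly: pixels per inch is the diagonal pixel count divided by the diagonal size. A quick sketch (the 15.4-inch MBP diagonal and a 27-inch 1440p desktop monitor are illustrative assumptions, not sizes from the post):

    ```python
    import math

    def ppi(width_px, height_px, diagonal_inches):
        """Pixels per inch: diagonal length in pixels over diagonal size in inches."""
        return math.hypot(width_px, height_px) / diagonal_inches

    print(round(ppi(2880, 1800, 15.4)))  # 15" MBP-style panel -> ~221 PPI
    print(round(ppi(2560, 1440, 27.0)))  # 27" 1440p monitor -> ~109 PPI
    ```

    The roughly 2x density is why the Retina panel shows no visible dot matrix at normal viewing distance while a same-size 1080p ultrawide does.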

    On a 17-inch laptop screen, for example, although I don't favor 1080p, I would take a good FHD panel with a high refresh rate over 4K @ 60Hz. I really think 1440p is the sweet spot right now, and even with the release of the 20 series cards, 4K panel technology just isn't there yet to offer a refresh rate that is beneficial or practical to gamers.

    4K does have its purpose, but not on a small display. I have a 28" 4K Samsung monitor that I use for charting and it works well for that, but on anything smaller it starts to get unbalanced in my opinion, often causing eye strain.

    In short, 4K @ 60Hz on a high-end gaming laptop is a complete waste and a total oxymoron. Unless, of course, the user plans to use a high-refresh external monitor the majority of the time, but then the question is: wouldn't a desktop be better if the laptop lives at a desk most of the time connected to an external monitor? This also sheds light on how small the niche crowd is that actually needs 4K on a DTR for their work.
     
    CaerCadarn likes this.
  50. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist

    Reputations:
    37,213
    Messages:
    39,333
    Likes Received:
    70,624
    Trophy Points:
    931
    Yeah, it really sucks. I've never been one to spend a lot of money on something like a phone, but it really irks me to no end... even on cheap ones. Now I have even less incentive to purchase an expensive smartphone. Maybe one of these days enough people with more than average technical knowledge and skills will get fed up to the point that they stop spending money on garbage long enough to inflict financial harm on the Nazi retards that pull this kind of crap. Feeding the hand that bites you is what brought us to this point.
     
    Darkhan, raz8020, Papusan and 2 others like this.
← Previous pageNext page →