The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to preserve the valuable technical information that had been posted on the forums. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.

    *OFFICIAL* Alienware Area-51M R1 Owner's Lounge

    Discussion in '2015+ Alienware 13 / 15 / 17' started by ssj92, Jan 8, 2019.

  1. alexnvidia

    alexnvidia Notebook Deity

    Reputations:
    434
    Messages:
    1,386
    Likes Received:
    622
    Trophy Points:
    131
    What was that part about you being the only one who knows what you're doing when it comes to OCing RTX GPUs, and being world no.1, BS this and that? Yeah, just what I thought.

    A little humility goes a long way. Instead of boasting, try being a bit more humble.
     
    Last edited: Feb 3, 2019
    Darkhan and hmscott like this.
  2. ssj92

    ssj92 Neutron Star

    Reputations:
    2,446
    Messages:
    4,446
    Likes Received:
    5,690
    Trophy Points:
    581
    Would undervolting help more for the RTX 20XX mobile series? Or are there more things going on this time around preventing better performance?
     
    alexnvidia and Vistar Shook like this.
  3. TBoneSan

    TBoneSan Laptop Fiend

    Reputations:
    4,460
    Messages:
    5,558
    Likes Received:
    5,798
    Trophy Points:
    681
    It’s not BS if it’s true though. He’s legit world #1.
     
  4. alexnvidia

    alexnvidia Notebook Deity

    Reputations:
    434
    Messages:
    1,386
    Likes Received:
    622
    Trophy Points:
    131
    There are good reasons why many are scared of ngreedia. After almost 3 years of waiting, they gave us a laptop GPU that's only slightly faster than the GTX 1080, while jacking up the price and making us pay for features that are either half-baked or MIA. Some of you might be well off or have easy access to new laptops, but the fact is most of us don't, and it's our hard-earned money that goes to ngreedia (it's not like we have a choice or anything. AMD, where are you??). Speculation tends to arise when something is overpriced and under-delivers. And when solid reviewers like Jarrod'sTech showed the whole world how poorly the RTX 2080 OCs, one can't help but wonder what's really going on here. Sure, some might say it's all speculation, but others (people like me) would argue otherwise based on an educated guess.

    We all know Turing is a HUGE GPU, and with a huge GPU, the chances of failed or poorly performing chips are high. That's just the nature of silicon manufacturing, and nothing goes to waste: certain defective die areas get blocked off and the chip is marketed as a lower-tier product (i.e. the 2060 is a recycled 2070 that didn't make the cut). This is called die harvesting. Dies that don't make the cut for the desktop 2080 might get recycled, have their clocks reduced significantly, and be marketed as Max-Q or lower-clocked laptop GPUs. OCing a GPU is not rocket science, at least not if you aren't aiming to be world "no.1". Having stability issues at just over a 70MHz OC sounds off.

    Having said that, we definitely need more data from more 2080 OC attempts to determine whether this die-harvesting theory is correct.
     
    Last edited: Feb 3, 2019
  5. Terreos

    Terreos Royal Guard

    Reputations:
    1,170
    Messages:
    1,847
    Likes Received:
    2,264
    Trophy Points:
    181
    16 more days... come on, Dell, be a pal and send mine early. :D

    BTW my Area 51m backpack got delayed till Feb 19-21. Which is fine, as I don't have a use for it before then.
     
    Last edited: Feb 3, 2019
  6. c69k

    c69k Notebook Deity

    Reputations:
    1,118
    Messages:
    1,301
    Likes Received:
    1,032
    Trophy Points:
    181
    It must be a real kick having a 9900K!!! I am so hungry to tweak a new laptop. My next one will definitely have an unlocked desktop processor. The new laptop will have to have a 10-bit OLED 120Hz G-Sync panel, otherwise no deal.
     
    Vasudev and Vistar Shook like this.
  7. Johnksss

    Johnksss .

    Reputations:
    11,536
    Messages:
    19,469
    Likes Received:
    12,881
    Trophy Points:
    931
    Yes. Just like all the other versions before it. You need to undervolt the card to stay below its 150W ceiling.
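    The intuition behind undervolting under a fixed power ceiling: dynamic power scales roughly with voltage squared times clock, so shaving voltage at the same clock buys a disproportionate power saving. A toy sketch of that relation (the voltage and clock figures below are illustrative only, not measured 2080 numbers):

```python
# Toy model: dynamic power ~ V^2 * f. Same clock at lower voltage
# means quadratically lower power, which is why undervolting helps
# a card stay under a firmware power ceiling.
def rel_power(voltage: float, clock_mhz: float) -> float:
    """Relative dynamic power for a given voltage and clock."""
    return voltage ** 2 * clock_mhz

stock = rel_power(1.00, 1800)        # hypothetical stock point
undervolted = rel_power(0.90, 1800)  # same clock, -100 mV

print(undervolted / stock)  # ≈ 0.81, i.e. ~19% less power at the same clock
```

    Real silicon isn't this tidy (leakage, boost behavior, temperature all factor in), but it captures why a modest voltage drop frees noticeable headroom under a 150W cap.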
     
    Rei Fukai, ssj92 and Vistar Shook like this.
  8. propeldragon

    propeldragon Notebook Evangelist

    Reputations:
    122
    Messages:
    536
    Likes Received:
    365
    Trophy Points:
    76
    Personally, after overclocking several different 10-series cards, your GPU is definitely more efficient. I've never been able to use that high a voltage without hitting the power limit. On air, of course.
     
    Vistar Shook likes this.
  9. Johnksss

    Johnksss .

    Reputations:
    11,536
    Messages:
    19,469
    Likes Received:
    12,881
    Trophy Points:
    931
    Mine is more efficient because my cards are pretty much always modded. We don't sit and "Drink the Nvidia Kool-Aid". Why do you think most people want PremaMod systems? Because the GPU and BIOS are almost always modded, my friend.

    Edit:
    For the record, the 1080N was able to reach 300+W. This can never be done on "Stock Firmware"
     
    Last edited: Feb 3, 2019
    Rei Fukai, Vasudev, ssj92 and 4 others like this.
  10. HaloGod2012

    HaloGod2012 Notebook Virtuoso

    Reputations:
    766
    Messages:
    2,066
    Likes Received:
    1,725
    Trophy Points:
    181
    Exactly why I went for the P775 from HID


    Sent from my iPhone using Tapatalk
     
    Ashtrix, ssj92, raz8020 and 2 others like this.
  11. Johnksss

    Johnksss .

    Reputations:
    11,536
    Messages:
    19,469
    Likes Received:
    12,881
    Trophy Points:
    931
    Which makes perfect sense!

    Right now sitting here wondering how this A51M is going to pan out.
     
    ssj92 likes this.
  12. Vistar Shook

    Vistar Shook Notebook Deity

    Reputations:
    2,761
    Messages:
    1,256
    Likes Received:
    1,362
    Trophy Points:
    181
    So the single and sli 1080N firestrike score you posted is TDP modded to 300+? wow...nice.
     
    Rei Fukai and raz8020 like this.
  13. Johnksss

    Johnksss .

    Reputations:
    11,536
    Messages:
    19,469
    Likes Received:
    12,881
    Trophy Points:
    931
    Yes.

    So with how many watts that card can pull, you would think they would give us a bit more, since it has been proven time and time again what it can handle. What do they do? Come out with some nonsense 150W ceiling when the card can legitimately do 300W+ as currently configured. BGA would of course be different, but nonetheless just as powerful this round.
     
  14. IKAS V

    IKAS V Notebook Prophet

    Reputations:
    1,073
    Messages:
    6,171
    Likes Received:
    535
    Trophy Points:
    281
    C'mon Nvidia, we waited 3 years for this!???
    The more I look into it, the more I think I might skip this gen or just get a cheap RTX 2060 laptop.
    I'm still holding out for 51M reviews, but they better live up to the hype.
    I hope we don't see another 3 years between new GPUs.
    I really like the M51, but Nvidia really dropped the ball pushing RTX when it should have just focused on upping performance.
    RTX is not ready for primetime. Who asked for this anyway?
     
    Vistar Shook, alexnvidia and hmscott like this.
  15. dodgehemi0

    dodgehemi0 Notebook Evangelist

    Reputations:
    34
    Messages:
    622
    Likes Received:
    80
    Trophy Points:
    41
    OH SNAP!!!!!


    Sent from my iPhone using Tapatalk Pro
     
  16. Papusan

    Papusan Jokebook's Sucks! Dont waste your $$$ on Filthy

    Reputations:
    42,737
    Messages:
    29,856
    Likes Received:
    59,711
    Trophy Points:
    931
    Nvidia just follows the notebook trend. I'm sure they don't see the point in pushing out mobile graphics with a TGP equal to or higher than the first-gen desktop-class laptop cards (Maxwell N). The cooling in many of today's and tomorrow's gaming laptop models won't handle it.
    They know very well that almost all notebook manufacturers today have abandoned thicker gaming laptops for thinner Apple-style designs.

    The few thicker laptop models we still have don't get the Nvidia desktop power, due to the love for thin and flimsy Joke-books.
     
    Last edited: Feb 3, 2019
  17. Johnksss

    Johnksss .

    Reputations:
    11,536
    Messages:
    19,469
    Likes Received:
    12,881
    Trophy Points:
    931
    Vistar Shook and Papusan like this.
  18. Papusan

    Papusan Jokebook's Sucks! Dont waste your $$$ on Filthy

    Reputations:
    42,737
    Messages:
    29,856
    Likes Received:
    59,711
    Trophy Points:
    931
    Yeah, it's sad.
    The Future of Gaming is Thin on Pounds, Heavy on Pixels. By MSI's CEO, Charles Chiang

    "When it comes to his company's bread and butter products, gaming PCs, Chiang believes the future is thin, mobile and high resolution. He said he expects to see slimmer and lighter gaming laptops become a much bigger portion of the market in the years ahead, particularly as new processes like Intel's 10nm and AMD's 7nm bring greater power efficiency."
     
    Ashtrix, Vistar Shook and raz8020 like this.
  19. Johnksss

    Johnksss .

    Reputations:
    11,536
    Messages:
    19,469
    Likes Received:
    12,881
    Trophy Points:
    931
    And I'm guessing this guy must have a lucky 2080 as well....
    https://www.3dmark.com/fs/18149840

    We have plenty of folks who know what they are doing, but we can't seem to get compensated by Nvidia in the form of a performance boost. Just performance nonsense.
     
  20. IKAS V

    IKAS V Notebook Prophet

    Reputations:
    1,073
    Messages:
    6,171
    Likes Received:
    535
    Trophy Points:
    281
    Is that from the MSI GE75 Raider?
     
  21. Johnksss

    Johnksss .

    Reputations:
    11,536
    Messages:
    19,469
    Likes Received:
    12,881
    Trophy Points:
    931
    I'm not sure, but it's beating an Area 51M in the GPU department, just like everyone else is... Well, until they show their hand, that is. You know... not using fake heatsinks designed for 2060s and 8700s and all.
     
  22. Kuro Kensei

    Kuro Kensei Notebook Consultant

    Reputations:
    115
    Messages:
    168
    Likes Received:
    231
    Trophy Points:
    56
    So, your CPU+GPU pulled 400W+ during this run? Wow, my Ryzen + Vega 10 rocks! It scored 1/10 of your result at 1/50 of the power draw! lol
    On a more serious note, I do hope Dell's cards will be possible to mod and push past 300W in the end. Even if that requires a custom modded PSU as well.
     
  23. iron_megalith

    iron_megalith Notebook Geek

    Reputations:
    5
    Messages:
    82
    Likes Received:
    27
    Trophy Points:
    26
    Holy crap. If this is true, I'm glad I ditched the 17R5, and customized ones should have a lot more to offer.

    I really want to see more tests for this unit.
     
  24. Papusan

    Papusan Jokebook's Sucks! Dont waste your $$$ on Filthy

    Reputations:
    42,737
    Messages:
    29,856
    Likes Received:
    59,711
    Trophy Points:
    931
    Dell has already confirmed a CPU power cap. Why shouldn't they do exactly the same for the PSU? They have done it before. You will most likely need a fully unlocked firmware. But I'm not sure this will help either... unlocking Razer's firmware doesn't help, for example.
     
    Last edited: Feb 3, 2019
    Ashtrix, Vistar Shook and Rei Fukai like this.
  25. ssj92

    ssj92 Neutron Star

    Reputations:
    2,446
    Messages:
    4,446
    Likes Received:
    5,690
    Trophy Points:
    581
    One thing is for sure. It can use 330W + 180W for a total of 510W.

    CPU being limited to 119W still gives you 391W. GPU is limited to 180W so there's still 211W left over which doesn't make sense.

    Unless AW just wanted to show off, they'd need to increase that CPU ceiling.

    But we can't judge until a NBR member has one to put it through some real tests. ;)
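    The wattage arithmetic in the post above can be sketched directly (the figures are the ones quoted in this thread, not official Dell specs):

```python
# Power-budget arithmetic from the thread: two PSUs feed the Area-51m,
# and the reported CPU/GPU caps leave a chunk of headroom unaccounted for.
psu_total = 330 + 180   # combined PSU output, watts
cpu_cap = 119           # reported CPU power cap, watts
gpu_cap = 180           # reported GPU power cap, watts

after_cpu = psu_total - cpu_cap   # watts remaining once the CPU is maxed
leftover = after_cpu - gpu_cap    # watts unaccounted for after the GPU cap

print(psu_total, after_cpu, leftover)  # 510 391 211
```

    The 211W of unexplained headroom is exactly the puzzle the post raises: either the caps are conservative, or the combined budget is never fully usable.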
     
    hmscott and raz8020 like this.
  26. Papusan

    Papusan Jokebook's Sucks! Dont waste your $$$ on Filthy

    Reputations:
    42,737
    Messages:
    29,856
    Likes Received:
    59,711
    Trophy Points:
    931
    How can you be sure the notebook can utilize everything from both PSUs? Is this confirmed? Nope. All I have seen them state is that the iGPU can be used... The smaller 180W brick is nice for max portability if you don't have to fire up the Nvidia graphics on the go.
     
    Last edited: Feb 3, 2019
    Ashtrix, Vistar Shook and raz8020 like this.
  27. ssj92

    ssj92 Neutron Star

    Reputations:
    2,446
    Messages:
    4,446
    Likes Received:
    5,690
    Trophy Points:
    581
    They've said countless times in the videos that you can use one PSU (180W OR 330W) for portability, but for maximum performance you'll need to use both. So it should be able to utilize both.

    When I asked Mr. Azor about 2x 330W PSUs, he said they'd work but you won't get any extra performance. So either there is a 510W (330W+180W) limit, or possibly a 2x 330W limit, which is more doubtful.
     
  28. Papusan

    Papusan Jokebook's Sucks! Dont waste your $$$ on Filthy

    Reputations:
    42,737
    Messages:
    29,856
    Likes Received:
    59,711
    Trophy Points:
    931
    As I said... this doesn't mean you will be able to utilize everything (all the powa) from both PSUs. You can draw more power than from the single 330W in this dual-PSU setup, but that doesn't mean they haven't added a max power cap. Only a fully working, mod-unlocked firmware can confirm this.

    Remember, the 330W+180W PSU combo is equal to 630W+ from the wall at ~80% PSU efficiency.
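    That wall-draw figure follows from dividing DC output by PSU efficiency; a quick sketch assuming the ~80% efficiency mentioned above (real bricks vary with load):

```python
# AC draw at the wall for a PSU delivering its rated DC output:
# wall = output / efficiency. With 330W + 180W = 510W combined output
# at ~80% efficiency, that lands just past the "630W+" quoted above.
def wall_draw(output_watts: float, efficiency: float = 0.80) -> float:
    """DC output divided by conversion efficiency gives AC wall draw."""
    return output_watts / efficiency

print(wall_draw(330 + 180))  # 637.5
```

    Efficiency is a function of load, so 80% is a rough working assumption, not a spec for these particular Dell bricks.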
     
    Last edited: Feb 3, 2019
    Ashtrix, Rei Fukai and raz8020 like this.
  29. Vasudev

    Vasudev Notebook Nobel Laureate

    Reputations:
    12,050
    Messages:
    11,278
    Likes Received:
    8,816
    Trophy Points:
    931
    Non-G-Sync models might have hardware-MUX-based Optimus switching, and G-Sync models will not have Optimus Prime. From the repair sheet, I think the AW 51M can draw power from both ports. Which DC-IN is functional is yet to be seen in review models. It would have been better to include a single DC-IN port with the Eurocom 780W PSU.
     
    Vistar Shook and Rei Fukai like this.
  30. VoodooChild

    VoodooChild Notebook Evangelist

    Reputations:
    519
    Messages:
    541
    Likes Received:
    1,014
    Trophy Points:
    156
    See, this is the problem I see with this system. Where are the reviews for the "fastest laptop in the world"?
    Azor was singing to everyone willing to listen at CES and afterwards that "this is the fastest laptop in the world" and "the first with a desktop-class CPU" and this is "great legend design which they've changed only 5 times since the founding of AW" and all that nonsense, while we see Clevos again here rocking out numbers, and that is what matters. Numbers! Performance numbers.
    If this is truly an enthusiast-class system, then where are the numbers for it? Surely one good review would be out by now, don't you think?
    I, for one, think that they're still figuring things out, and when all is said and done, it'll be up to us guys here at the forums to fix our individual systems (hopefully not again) and repair what should have been working in the first place. That's why we pay a premium for such systems, isn't it?
    I see no way that the 9900K isn't reaching 99° while just gaming at 1080p. It's up to Dell to prove me wrong; they still haven't shown me the numbers that matter, and they'll not have a penny of mine until proven otherwise. I am skeptical nonetheless.

    Sent from my SM-G965F using Tapatalk
     
  31. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist®

    Reputations:
    37,255
    Messages:
    39,354
    Likes Received:
    70,777
    Trophy Points:
    931
    Just don't give them anything they can use against us like they have in the past.

     
    Ashtrix, Vistar Shook, GTVEVO and 6 others like this.
  32. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist®

    Reputations:
    37,255
    Messages:
    39,354
    Likes Received:
    70,777
    Trophy Points:
    931
    I would interpret that to indicate that it is dropping to 95W, which means it is throttling to non-turbo clock speeds. Intel TDP now is always non-turbo (base clock). They do not publish TDP for turbo clocks, as it can vary by CPU and system. It cannot hold turbo clocks under load at 95W because that is not enough power. It may be due to the GPU being under load with cancer firmware castrating the CPU when the GPU is under stress. They started that nonsense with the Alienware 18. Rather than supply the power it needs to run at full performance, they cut the performance back to keep the power draw at a predefined maximum they deem to be adequate.
     
  33. ssj92

    ssj92 Neutron Star

    Reputations:
    2,446
    Messages:
    4,446
    Likes Received:
    5,690
    Trophy Points:
    581
    Ah, that was a sad story indeed. This is why I tend not to share anything like this with anyone except close people who won't say anything. Manufacturers seem to just block out everything they can.

    There was a member here getting P5000 SLI to work and we never learned how, since the person said Nvidia would block it, which is true.

    I watched most of that video (skipping here and there) and the GPU at least seems to be running well. As long as the GPU was kept at a decent temp, it stayed at its turbo clock (no crazy fluctuations like I saw on my MSI 1060).

    The CPU ran hot and throttled. The YouTuber mentioned the 2070+9700K heatsink being used on the 9900K+2080 config, so we'll see in a few weeks if that was true.
     
    Last edited: Feb 3, 2019
    Ashtrix, Vistar Shook and Johnksss like this.
  34. Reciever

    Reciever D! For Dragon!

    Reputations:
    1,530
    Messages:
    5,350
    Likes Received:
    4,372
    Trophy Points:
    431
    I don't see why they can't get an 8700K running in there; that would actually make sense.

    I'm a layman though.
     
  35. iron_megalith

    iron_megalith Notebook Geek

    Reputations:
    5
    Messages:
    82
    Likes Received:
    27
    Trophy Points:
    26
    I've been wondering as well why there is so much radio silence on this unit. If it really is a parts-availability issue, I suspect it could have affected review units as well, which would mean they'll arrive late.

    Just speculation.
     
  36. iron_megalith

    iron_megalith Notebook Geek

    Reputations:
    5
    Messages:
    82
    Likes Received:
    27
    Trophy Points:
    26
    I'm getting mixed answers when I ask them about this. Umar said no, but another rep said yes... An 8700K would really be a reasonable choice for this unit. At this point I'll wait till someone tries one out.
     
  37. Cass-Olé

    Cass-Olé Notebook Evangelist

    Reputations:
    728
    Messages:
    338
    Likes Received:
    986
    Trophy Points:
    106
    I saw this tweet & this one the other day & forgot to bring 'em in here; anybody see these, or need to know this?
    GM tries to nix idea of buying now with intentions of LCD panel upgrade later (so why wait / just buy now !)

    Obviously, if they'd designed it to be upgradeable, they could sell you the 1080p now and the 1440p/4K later and everybody wins; surely they knew 1080p wouldn't cut it in the long run and that owners would upgrade down the road, so why not prep the way now for easy mods later, or at least try? ... cui bono?

    "figured I'd do it meself, was able to on past alienwares!" ... & still got no love from the GM, heheh
    :rolleyes:
    Indeed
    *Note my prior post regarding CPU/GPU/heatsink swaps: Azor's tweet said those were worthy of a new How-To video on AlienTube; a panel swap he just says "no" to, make of it what u will
    yeah, old ones
     
    Last edited: Feb 6, 2019
  38. Reciever

    Reciever D! For Dragon!

    Reputations:
    1,530
    Messages:
    5,350
    Likes Received:
    4,372
    Trophy Points:
    431
    If they can be put on, they can be taken off.

    He's basically saying it's unlikely to still be under warranty if you do it yourself, but without actually saying so, in order not to deflate the hype for the machine.
     
    Vasudev likes this.
  39. gthirst

    gthirst Notebook Evangelist

    Reputations:
    97
    Messages:
    342
    Likes Received:
    231
    Trophy Points:
    56
    I'm on my 4th redone order now, since the Dell CSR keeps messing it up. I've spent a few hours total on the phone, plus multiple e-mails. If I didn't want this unit so badly I would have certainly cancelled. They are sending me another confirmation tomorrow with order details. If it is wrong on absolutely anything, I'm done with them forever.

    In short this is what happened:
    1st order: wrong ram and HDD options
    2nd order: wrong screen and case color
    3rd order: same price as first order, but with downgraded options
    4th order: to be determined

    I've also come to terms with having the 1080p screen. I suspect the RTX 2080 will be the sweet spot for many new AAA games at 1080p/144. Anthem benched on similar specs (on the demo, which is fraught with problems) doesn't hit 144 AFAIK. Hopefully it is a very nice panel and I won't feel the need to upgrade it later.
     
  40. HaloGod2012

    HaloGod2012 Notebook Virtuoso

    Reputations:
    766
    Messages:
    2,066
    Likes Received:
    1,725
    Trophy Points:
    181
    The screen is what made me decide most against it. I love extremely bright screens, and the only 400-nit screens around are the 1440p 120Hz G-Sync panels in the Alienware 17 R5 and the P775; they are gorgeous panels.


    Sent from my iPhone using Tapatalk
     
  41. XxAcidSnowxX

    XxAcidSnowxX Notebook Consultant

    Reputations:
    85
    Messages:
    248
    Likes Received:
    203
    Trophy Points:
    56
    Strange how confident he is saying "no"... Doesn't seem right to just outright say "no"... Why not? Did they do something stupid like glue the panel into the upper assembly?
     
  42. rinneh

    rinneh Notebook Prophet

    Reputations:
    854
    Messages:
    4,897
    Likes Received:
    2,191
    Trophy Points:
    231
    They used adhesive for the past couple of generations, but the panels were always removable. I have exchanged screen panels in the past 3 generations fairly easily, including my current AW15R3. Just get a new adhesive strip after cleaning the old one off and you will be fine. It's extra work, of course, compared to just exchanging a full screen assembly, but it is certainly doable.

    But in the current service manual they only show how to remove the whole assembly, so don't be surprised if they did something totally different this time. The service manuals for previous models did show how to remove the panel itself.
     
  43. cn555ic

    cn555ic Notebook Deity

    Reputations:
    149
    Messages:
    917
    Likes Received:
    470
    Trophy Points:
    76
    The sweet spot for the RTX 2080 is 1440p at 120Hz, not 1080p.
     
  44. ssj92

    ssj92 Neutron Star

    Reputations:
    2,446
    Messages:
    4,446
    Likes Received:
    5,690
    Trophy Points:
    581
    1440p 120hz 100% AdobeRGB panel would be perfect

    I personally wouldn't mind a 2160p 100Hz 100% AdobeRGB panel either, but I'm probably the 0.001%.

    Oh OLED would be nice too. AW 13 R3 still has one of the best displays I've used aside from the 17.3" 1920x1200 RGB display used in the M17xR2 and Dell Studios.
     
    propeldragon, Terreos and Kuro Kensei like this.
  45. Kuro Kensei

    Kuro Kensei Notebook Consultant

    Reputations:
    115
    Messages:
    168
    Likes Received:
    231
    Trophy Points:
    56
    You could upgrade it but it's not easy, and that's true:

    https://www.techinferno.com/index.php?/forums/topic/617-m18x-anti-glare-mod/

    http://forum.notebookreview.com/thr...-the-m18xs-matte-display.599542/#post-7772260

    Possible? Yes. Drop in swap? Hell no!
     
  46. rickybambi

    rickybambi Notebook Consultant

    Reputations:
    8
    Messages:
    110
    Likes Received:
    79
    Trophy Points:
    41
    This is what holds me back from purchasing a desktop replacement. I want to be able to swap screens and not just that, I want it to be quick and easy. As an example, if I'm playing a FPS it'd be awesome to have a high refresh rate screen but after I'm done with the gaming session, be able to within 30 seconds change out to a 4K screen for media consumption.
     
  47. ssj92

    ssj92 Neutron Star

    Reputations:
    2,446
    Messages:
    4,446
    Likes Received:
    5,690
    Trophy Points:
    581
    Well... even in this situation it seems the whole LCD assembly is removable. I'm sure you'll see some assemblies on eBay in the future.
     
    Rei Fukai likes this.
  48. gthirst

    gthirst Notebook Evangelist

    Reputations:
    97
    Messages:
    342
    Likes Received:
    231
    Trophy Points:
    56
    Yeah, but portability is still huge for some of us. That said, there is nothing stopping you from plugging into a different screen. I plan on doing that with the 51m. Playing games will mostly be on my lap with the high refresh rate, but I still have a projector for movie night!

    I have extra screens wherever I plan on transporting it to, though, so I've been thinking of switching to the MSI Trident X Plus, which has a 2080 Ti and is a mini PC. It is around $3000, but will no doubt have better performance than even a fully spec'd 51m.
     
  49. Johnksss

    Johnksss .

    Reputations:
    11,536
    Messages:
    19,469
    Likes Received:
    12,881
    Trophy Points:
    931
    Ummm, not sure where it was posted that it ran 400+ watts? It ran up to 350W max. So far.

    Never, never that!!!!
     
    Vistar Shook, Mr. Fox and ssj92 like this.
  50. alexnvidia

    alexnvidia Notebook Deity

    Reputations:
    434
    Messages:
    1,386
    Likes Received:
    622
    Trophy Points:
    131
    Actually, that happened when someone requested that he load the CPU ONLY. He said it's pointless to stress test the CPU only, because his BIOS was buggy and it would throttle to 95W, but he did it anyway just to satisfy his viewers. I'm guessing he was filming it live. Earlier on, before the CPU-only stress test, he did a CPU + GPU stress test and the CPU held its promised TDP, which was around 120W. So I have no reason not to believe his firmware was buggy.

    That part was easy to understand. The hard part was when someone asked him about the memory speed being stuck at 2400.
     
    Last edited: Feb 3, 2019
    ssj92 likes this.