The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information posted on the forums is preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.

    *OFFICIAL* Alienware "Graphics Amplifier" Owner's Lounge and Benchmark Thread (All 13, 15 and 17)

    Discussion in '2015+ Alienware 13 / 15 / 17' started by Mr. Fox, Dec 10, 2014.

  1. Game7a1

    Game7a1 ?

    Reputations:
    529
    Messages:
    3,159
    Likes Received:
    1,040
    Trophy Points:
    231
    Both the port and the cable would need 16 lanes for the PCIe connection to be x16. That would require a redesign (slight or heavy) of the laptops, the X51's back panel, and the GA.
    An adapter won't fix things, as you'll still be limited to 4 lanes.
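    For context, the numbers behind the lane discussion can be sketched from the published per-lane PCIe transfer rates. This is a back-of-the-envelope figure that ignores all protocol overhead beyond line encoding, so real-world throughput is lower; the function name is just illustrative:

```python
# Approximate one-direction PCIe link bandwidth from published per-lane
# transfer rates, accounting only for line-encoding overhead.
PER_LANE_GBPS = {
    "2.0": 5.0 * (8 / 10) / 8,     # 5 GT/s, 8b/10b encoding -> 0.5 GB/s per lane
    "3.0": 8.0 * (128 / 130) / 8,  # 8 GT/s, 128b/130b encoding -> ~0.985 GB/s per lane
}

def link_bandwidth_gbps(gen: str, lanes: int) -> float:
    """Rough usable bandwidth in GB/s for a PCIe link of the given width."""
    return PER_LANE_GBPS[gen] * lanes

print(round(link_bandwidth_gbps("3.0", 4), 2))   # 3.94  -> the GA's x4 link
print(round(link_bandwidth_gbps("3.0", 16), 2))  # 15.75 -> a desktop x16 slot
```

    In other words, an x16 redesign would quadruple the link, which is why a simple adapter on a 4-lane port and cable can't help.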
     
  2. Dan.J

    Dan.J Notebook Geek

    Reputations:
    0
    Messages:
    88
    Likes Received:
    16
    Trophy Points:
    16
    Being PCIe 3.0 @ x4, it doesn't really seem to hurt performance even on high end cards from what I see. I am on a GTX 980 Ti and it performs on par with what is normal for the card in whatever benchmark or game it is being used for. I guess what I'm saying is that the limited PCIe lanes don't seem to limit performance much, if at all, even on high end cards.
     
  3. Lordofdeath

    Lordofdeath Newbie

    Reputations:
    0
    Messages:
    2
    Likes Received:
    0
    Trophy Points:
    5
    I heard there's up to a 40-50% difference in performance between the GA and the actual desktop counterpart. That's a lot. You could see a variation of up to 80 fps. That's a lot of money wasted on a high end desktop GPU. Or am I getting it wrong?
     
  4. Game7a1

    Game7a1 ?

    Reputations:
    529
    Messages:
    3,159
    Likes Received:
    1,040
    Trophy Points:
    231
    You probably heard reports from the AW13 R1 with the i5-4210U. That's completely different from, say, an AW15 R1 with an i7-4710HQ or an AW17 R3 with an i7-6820HK.
    Remember that the CPU also plays a role in total game performance. The quad-core i7 CPUs in the 15 and 17 laptops should be sufficient for the GPUs. The dual-core CPUs in the 13 and 15 R1 (i5-4210H) will struggle. Notebookcheck did a brief on the GA with the 13 R1 (i5-4210U and i7-5500U) and 15 R1 (although they are a bit dated). You can also read my GA performance analysis by hitting the link in my signature.
     
    Last edited: Nov 28, 2015
  5. Welshmousepk

    Welshmousepk Notebook Consultant

    Reputations:
    1
    Messages:
    182
    Likes Received:
    11
    Trophy Points:
    31
    Definitely not correct. The performance difference between desktop and laptop is currently around 5-10 percent at worst.

    I would imagine this may become a bigger issue with more powerful cards, but I highly doubt it will make a huge difference due to the way data transfer happens. It's only going to affect performance at times when demand saturates the available bandwidth. This doesn't impose an upper limit on performance so much as it reduces available performance at the most demanding times.
     
  6. CODVash

    CODVash Notebook Enthusiast

    Reputations:
    0
    Messages:
    13
    Likes Received:
    0
    Trophy Points:
    5
    Hey guys. I got my Alienware 17 R3 as soon as it came out in October. I just got the GA and a PNY GTX Titan. I gotta say it did not live up to my expectations. I think my built-in 980M did just as well at getting fps in games like Black Ops 3. Does anyone else have this problem? Is it really because of the cable, or is it just that the difference between a 980M and a GTX Titan is not that big?
    Thanks
     
  7. Dan.J

    Dan.J Notebook Geek

    Reputations:
    0
    Messages:
    88
    Likes Received:
    16
    Trophy Points:
    16
    It's not the cable. If you bought a "regular" Titan then yes, performance should be about the same as your 980M. I run a 980 Ti in my amplifier and play Black Ops 3 as well; big difference over the 980M.
     
  8. rinneh

    rinneh Notebook Prophet

    Reputations:
    854
    Messages:
    4,897
    Likes Received:
    2,191
    Trophy Points:
    231
    The original Titan performs pretty much the same as a 980M.
     
  9. CODVash

    CODVash Notebook Enthusiast

    Reputations:
    0
    Messages:
    13
    Likes Received:
    0
    Trophy Points:
    5
    I believe it is the Titan X. The black one with 12 GB of memory.
     
  10. Dan.J

    Dan.J Notebook Geek

    Reputations:
    0
    Messages:
    88
    Likes Received:
    16
    Trophy Points:
    16
    Then something isn't right, but it isn't the cable. It sounds like your games are still running off the 980M and not the Titan X; probably a driver issue.
     
  11. CODVash

    CODVash Notebook Enthusiast

    Reputations:
    0
    Messages:
    13
    Likes Received:
    0
    Trophy Points:
    5
    http://imgur.com/AEK4cLe
    This is my setup. I might be wrong about it being a Titan X, but the laptop recognizes it as a Titan X and the drivers say they are up to date. Maybe I just seriously overestimated its power? It just doesn't run as fast as I thought it would. Would overclocking help? It might also be the game, Black Ops 3, which is the only one I have tested it on so far.
     
  12. Welshmousepk

    Welshmousepk Notebook Consultant

    Reputations:
    1
    Messages:
    182
    Likes Received:
    11
    Trophy Points:
    31
    Test some other games. Blops is not well optimized at all, and heavily CPU dependent (I recently went from 1080p to 1440p and found Blops gave me almost the same framerate on my 970M).

    Once you do this, run a couple of benchmarks both on your 980m and the titan and post them here.
    I'd also like to see some GPUz captures if you can, so we can see the voltage and temps of the titan in-game.

    Lots of things could be wrong, and something almost certainly is.
     
  13. CODVash

    CODVash Notebook Enthusiast

    Reputations:
    0
    Messages:
    13
    Likes Received:
    0
    Trophy Points:
    5
    I am not too familiar with benchmarking, but I would love to see what exactly is going on. So I guess I'll download GPU-Z; anything else I should download or use? Would overclocking help at all?
    Thanks everyone
     
  14. mertymen2010

    mertymen2010 Notebook Consultant

    Reputations:
    5
    Messages:
    213
    Likes Received:
    26
    Trophy Points:
    41
    I'm thinking there may be a Thunderbolt eGPU released at some point; there is a lot of talk about it. As much as I want the AGA, my instincts tell me to wait, as there may be a better solution on the horizon. I think AW kind of messed up only making it PCIe x4. Although people say it doesn't matter, it does to me. Kind of annoyed, as I spent a small fortune on this laptop. Still a fantastic machine though.
     
  15. CODVash

    CODVash Notebook Enthusiast

    Reputations:
    0
    Messages:
    13
    Likes Received:
    0
    Trophy Points:
    5
    I am mostly annoyed right now that I spent a small fortune buying the GA and a Titan X hoping that it would perform much, much better. It doesn't seem like it was worth the money over the 980M already in the laptop. Hopefully it's just the game that's the problem though.
     
  16. Welshmousepk

    Welshmousepk Notebook Consultant

    Reputations:
    1
    Messages:
    182
    Likes Received:
    11
    Trophy Points:
    31
    To be fair, you have one of the most powerful laptop GPUs at a time when the gap between mobile and desktop is the smallest it's ever been. Yours is not exactly the intended use case for the AGA; however, you SHOULD still be seeing an increase.

    What resolution are you running at?
    A Titan is not going to stretch its legs unless you're pushing it. If you can run some benchmarks and post back, we can figure out where your issue is.
     
    rinneh likes this.
  17. CODVash

    CODVash Notebook Enthusiast

    Reputations:
    0
    Messages:
    13
    Likes Received:
    0
    Trophy Points:
    5
    http://gpuz.techpowerup.com/15/12/02/c66.png
    So this is what GPU-Z spits out when I run MGS5 at 4K resolution with everything on extra high except for the post processing, which is off.
    I get a steady 23 fps. Is that normal? I saw a video of a guy getting similar fps, but at 8K resolution.
     
  18. CODVash

    CODVash Notebook Enthusiast

    Reputations:
    0
    Messages:
    13
    Likes Received:
    0
    Trophy Points:
    5
    http://gpuz.techpowerup.com/15/12/02/9rv.png
    Lowering the resolution to 2048x1152 gets 33 fps. Even typing this is laggy, as the game is running in the background.
    I believe some readings like the GPU load are dropping to 0% when I alt-tab out.
     
  19. CODVash

    CODVash Notebook Enthusiast

    Reputations:
    0
    Messages:
    13
    Likes Received:
    0
    Trophy Points:
    5
    http://gpuz.techpowerup.com/15/12/02/4yk.png
    One last capture at a resolution of 1980x1200 and fps of 34ish.

    I am trying Fallout 4 now, and amazingly, with everything turned up to Ultra (except godrays on low and shadow distance on medium), plus TAA (the best antialiasing), 16 samples, and max view distance, I am getting a solid 60 fps: http://gpuz.techpowerup.com/15/12/02/atk.png

    Sorry, this is my first time doing this. I changed GPU-Z to show the highest readings; that should be easier to analyze? http://gpuz.techpowerup.com/15/12/02/uzg.png

    Overall I feel like the dedicated memory usage is low: the Titan X is supposed to have 12 GB but is only using 3-4 GB. Or is that normal?
     
  20. CODVash

    CODVash Notebook Enthusiast

    Reputations:
    0
    Messages:
    13
    Likes Received:
    0
    Trophy Points:
    5
    http://gpuz.techpowerup.com/15/12/02/upt.png This was Metal Gear Solid 5 again with GeForce optimization, which had everything on extra but resolution at 1980x1200. Still stuck at around 35 fps; is it just the games, or what, haha.
     
  21. Welshmousepk

    Welshmousepk Notebook Consultant

    Reputations:
    1
    Messages:
    182
    Likes Received:
    11
    Trophy Points:
    31
    The memory usage is probably normal. Very few games will come close to using the Titan's full complement (it's not really designed for gaming).

    The framerates definitely seem very low. What CPU are you running? It looks like it could be a CPU bottleneck.
    Can you also check in GPU-Z on the main screen that it's reporting the connection as PCIe 3.0?

    Some models of AW are stuck on 2.0 if you don't have a BIOS update.
     
  22. CODVash

    CODVash Notebook Enthusiast

    Reputations:
    0
    Messages:
    13
    Likes Received:
    0
    Trophy Points:
    5
    http://imgur.com/uCHkEgS
    These are the stats on almost everything. I do see that 2.0. Is that really a problem? How can I fix it?
     
  23. Game7a1

    Game7a1 ?

    Reputations:
    529
    Messages:
    3,159
    Likes Received:
    1,040
    Trophy Points:
    231
    Update your BIOS and Alienware Command Center to latest versions.
    As well, uninstall your current Nvidia driver with DDU and install the latest from Nvidia (or a stable one at least).
     
  24. CODVash

    CODVash Notebook Enthusiast

    Reputations:
    0
    Messages:
    13
    Likes Received:
    0
    Trophy Points:
    5
    Thank you guys so much. That BIOS update is ridiculous: my settings now show PCIe 3.0 x16 @ x8 3.0 instead of PCIe 2.0 x16 @ x8 2.0.
    Hopefully that makes a huge difference? Currently updating ACC, but I'm kind of worried about uninstalling drivers; it already has the latest one according to GeForce Experience.
     
  25. CODVash

    CODVash Notebook Enthusiast

    Reputations:
    0
    Messages:
    13
    Likes Received:
    0
    Trophy Points:
    5
    Update: my settings went down again to PCIe 3.0 x16 @ x4 3.0 instead of PCIe 3.0 x16 @ x8 3.0. I don't know why or how, or even if that x4 instead of x8 makes a difference?
    Metal Gear with all high settings and a bit higher resolution, which was getting about 32 fps, is now a solid 53.
     
  26. Welshmousepk

    Welshmousepk Notebook Consultant

    Reputations:
    1
    Messages:
    182
    Likes Received:
    11
    Trophy Points:
    31
    Yeah, I believe that's correct. And those numbers sound much better.

    Now that it won't be bottlenecking, play with your settings: you may find you can increase them further without impacting the framerate too much, or that lowering them shows a bigger performance increase.
     
  27. Game7a1

    Game7a1 ?

    Reputations:
    529
    Messages:
    3,159
    Likes Received:
    1,040
    Trophy Points:
    231
    It's supposed to be x4 since the cable and port only have 4 physical lanes.
    The mobile GPU, though, can use 8 lanes.
     
  28. CODVash

    CODVash Notebook Enthusiast

    Reputations:
    0
    Messages:
    13
    Likes Received:
    0
    Trophy Points:
    5
    Thank you all again for your help. I seem to be getting better fps results. Still not as good as I had hoped, but much better. I think others can learn from this. Update your BIOS!
     
  29. Welshmousepk

    Welshmousepk Notebook Consultant

    Reputations:
    1
    Messages:
    182
    Likes Received:
    11
    Trophy Points:
    31
    So I finally got my AGA, set it up, and have a fairly annoying issue.

    I can't turn off the laptop display.

    The GTX 980 is enabled and pumping out to my 1440p monitor, but the Nvidia Control Panel only sees that monitor. The laptop display is still on and still the primary display.

    The Intel driver sees only the laptop display and not the external one.

    Surely I'm not stuck with using the laptop display as my primary?

    I need to keep the laptop open due to space constraints...
     
  30. rinneh

    rinneh Notebook Prophet

    Reputations:
    854
    Messages:
    4,897
    Likes Received:
    2,191
    Trophy Points:
    231
    It is correct that it is PCI Express 3.0 at x4; that's the limitation. But 3.0 x4 is twice as fast as 2.0 x4. I now notice you have a Titan X and not the previous Titan. You should expect around 25% to 30% more performance in general compared to the 980M inside your laptop. But the best way to test this is with a game like The Witcher 3, which will use your GPU fully. MGS5 and Black Ops 3 already run excellently on a 970M, which is a lot slower.
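    The "twice as fast" figure can be sanity-checked against the published per-lane transfer rates. This is a rough calculation that ignores protocol overhead beyond line encoding, so treat it as a ceiling, not a measurement:

```python
# Per-lane effective rates in GB/s after line-encoding overhead
# (published PCIe per-lane transfer rates).
PCIE2_PER_LANE = 5.0 * (8 / 10) / 8      # 5 GT/s, 8b/10b   -> 0.5 GB/s
PCIE3_PER_LANE = 8.0 * (128 / 130) / 8   # 8 GT/s, 128b/130b -> ~0.985 GB/s

lanes = 4  # the GA's physical link width
ratio = (PCIE3_PER_LANE * lanes) / (PCIE2_PER_LANE * lanes)
print(f"PCIe 3.0 x4 is {ratio:.2f}x PCIe 2.0 x4")  # prints "PCIe 3.0 x4 is 1.97x PCIe 2.0 x4"
```

    The gain is slightly under 2x because 3.0's 128b/130b encoding wastes less of each transfer than 2.0's 8b/10b, partially offsetting the raw 5-to-8 GT/s step.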
     
  31. Game7a1

    Game7a1 ?

    Reputations:
    529
    Messages:
    3,159
    Likes Received:
    1,040
    Trophy Points:
    231
    Right-click on the background and select Display settings, or press Fn + F8.
    Select "Show desktop only on 2".
    Done. The laptop's display is turned off.
    The GPU in the GA does not control any monitor run by the laptop/iGPU, only monitors connected to it.
     
  32. Game7a1

    Game7a1 ?

    Reputations:
    529
    Messages:
    3,159
    Likes Received:
    1,040
    Trophy Points:
    231
    So Alienware has some charts to share. They involve the Alienware 17 R2 and the GTX 980 (and the GTX 980M, if you were wondering).
    [Attached charts: Tomb Raider (Caldera) and 3DMark Fire Strike performance]
    And a video (using the 15 R2 and GTX TITAN X, with no comparison, though they do have a comparison video playing Final Fantasy XIV).
    As well, they'll be doing some live benchmarks with the Graphics Amplifier against the Area 51 R2 and a Q&A about the Graphics Amplifier on their Twitch channel tomorrow at 5 P.M. CST.
     


    Last edited: Dec 2, 2015
  33. Welshmousepk

    Welshmousepk Notebook Consultant

    Reputations:
    1
    Messages:
    182
    Likes Received:
    11
    Trophy Points:
    31
    No dice. The Intel driver doesn't see the external monitor at all. It thinks it's running a single display.
    Because each one is seeing a different screen, I can't even rearrange the monitors to set one as primary.
     
  34. Game7a1

    Game7a1 ?

    Reputations:
    529
    Messages:
    3,159
    Likes Received:
    1,040
    Trophy Points:
    231
    Then something's wrong on your end, as those steps work for me.
    Of course the Intel driver doesn't see the other monitor. The iGPU isn't powering it; the GTX 980 is, and vice versa for the laptop's display.
    However, Windows can arrange the displays to your liking and set which one should be the primary one.
    Don't do any monitor assigning or rearranging through the Intel Graphics or Nvidia Control Panel unless you have connected more than one monitor to each respective source. Actually, now that I think about it, that could get messy; just use Windows's display manager. Use the individual graphics control panels for configuring each monitor's settings and such.
     
    Last edited: Dec 3, 2015
  35. Welshmousepk

    Welshmousepk Notebook Consultant

    Reputations:
    1
    Messages:
    182
    Likes Received:
    11
    Trophy Points:
    31
    Yeah, that did it.

    I had thought you were talking about going into the iGPU driver. After years of conditioning never to use the Windows display manager for any reason, I totally forgot it even existed.
     
  36. cruisin5268d

    cruisin5268d Notebook Evangelist

    Reputations:
    51
    Messages:
    487
    Likes Received:
    119
    Trophy Points:
    56
    Hey folks,

    Just got my AGA ($142, Dell MPP!) for my 17 R3. I'm a bit torn on which GPU to get: a 980/980 Ti now, or a 970 now and roll with that until the new Nvidia GPUs come out. If they are as good as they appear to be, it might be worth the wait; but if I drop enough money for a 980 or 980 Ti then I'd probably wait until prices drop on the new Nvidia line before making the upgrade.

    Ideally I want my monitor setup to be 3 G-Sync monitors. Currently I have two ASUS ROG Swifts, but they are the original gen. I got a killer deal at $540 each but like a fool didn't realize they weren't the new series. I was thinking about returning both of them to get one of the new ones (man, they are bloody expensive) and getting a second new Swift once prices start to come down. I'm also open to a curved-screen monitor, but I have yet to find a G-Sync model or one aimed at gamers.

    So,
    1) Which GPU should I get? 970, 980, or 980 Ti? Regardless of the series, any advice on the specific card would be appreciated, as I've seen lots of conflicting info on which cards fit and which don't.
    2) Are the new Asus Swifts worth the extra money? Should I trade in my two for one of them, or just roll with what I have and add a third one down the road?
    3) Curved screens. Thoughts?

    Thanks in advance folks. I've been out of the gaming world for many years and am playing catch up.
     
  37. MatthewAMEL

    MatthewAMEL Notebook Consultant

    Reputations:
    80
    Messages:
    128
    Likes Received:
    13
    Trophy Points:
    31
    I have an AW17 R2 with a 980M. I went with the 980 Ti in my AGA. The 970 would've been about the same speed as the 980M.

    I have the EVGA 980 Ti (not overclocked). I get around 12.5k in Fire Strike and have no issues gaming in anything on Ultra at 2560x1440 (GTA V, Fallout 4, World of Warships, Project CARS).

    G-Sync is worth it IMHO. Curved screens are a gimmick.

    Initially there were many, many headaches with the 980 Ti and switching, but the last few Nvidia drivers have fixed it. Switching between laptop/AGA now works with no issues.
     
  38. Punisher5.0

    Punisher5.0 Notebook Geek

    Reputations:
    0
    Messages:
    79
    Likes Received:
    16
    Trophy Points:
    16
    Yeah, right now it's not worth putting a card in there that's less than a Ti. Today's mobile GPUs are amazingly powerful.
     
  39. cruisin5268d

    cruisin5268d Notebook Evangelist

    Reputations:
    51
    Messages:
    487
    Likes Received:
    119
    Trophy Points:
    56
    My thought with the 970 is that it would give me multi-G-Sync-monitor support, and I'd run that until the new Nvidia cards come out. I'm not crazy about this idea, but I don't want to throw money down the drain now on a 980 Ti just to end up buying a new card in 6-12 months. In actuality I'd end up keeping the Ti for much longer.

    From what I see, it looks like the 980 Ti can drive 3x 1440p screens without difficulty at high settings. I only have two now... so I'm thinking it should be able to drive two 1440p screens maxed out on current games. Am I right, or am I asking too much?

    And yes, I agree that curved screens are mostly a gimmick, but the idea of 3 of them wrapping around me is very groovy. I'm a bit OCD sometimes, and having 3 flat screens awkwardly angled around my head is just... awkward.
     
  40. cruisin5268d

    cruisin5268d Notebook Evangelist

    Reputations:
    51
    Messages:
    487
    Likes Received:
    119
    Trophy Points:
    56
    If I go the 980 Ti route... which specific models fit? I'd like to stay away from the reference cards.

    I wonder if the EVGA card with the included water cooler would work and be long enough to replace the AGA fan. That would save me from needing to splice wires to replace the noisy fan it came with.

    As I understand it for the Nvidia cards: Asus > MSI > Gigabyte > EVGA > Zotac. Is that order of overall quality + support + performance correct?
     
  41. kakashisensei

    kakashisensei Notebook Consultant

    Reputations:
    41
    Messages:
    217
    Likes Received:
    27
    Trophy Points:
    41
    That is not true. Each brand has its strengths and weaknesses, and each brand has its top end and bottom end versions. For performance, you have to look at the factory overclocks (boost clock / memory clock). For overclocking potential, you have to look at the voltage/power regulation (typically the top end versions have good ones).

    EVGA has the best support/warranty. Zotac probably has the best value with its AMP! and AMP! Extreme versions. The MSI Lightning, I've heard, is one of the best top end versions. The EVGA Kingpin and Classified are also supposed to be very good, but significantly more expensive. In terms of cooling, there isn't that significant a difference as long as it's not the stock Nvidia blower-style cooler: all the custom coolers have lots of heatpipes and multiple fans, and you can adjust fan profiles/loudness via third-party software. Coil whine is random depending on the specific card sample, and I'd expect it with the stock GA power supply.

    If you want one that fits without modding the case, you will have to sacrifice some options. Pretty much any version that has extra card height (typically for better voltage/power regulation) or a bigger heatsink won't fit. Heatsinks that take up more than 2 slots definitely won't fit.

    Based on what I've seen, these are the cards that will fit without issue:
    Any stock 980 Ti with the Nvidia blower-style cooler
    Any EVGA 980 Ti with the ACX cooler (for hybrid coolers you have to mod the case)
    Possibly the Gigabyte G1
     
  42. cruisin5268d

    cruisin5268d Notebook Evangelist

    Reputations:
    51
    Messages:
    487
    Likes Received:
    119
    Trophy Points:
    56
    You make some good points; thanks for your overview. What is coil whine?

    Everyone mentions modding the case. What type of work are we talking about? It was only $142, so I'm not super worried about damaging it; heck, it's a piece of junk anyway. I have to literally yank the lid to get it to open.

    I want to stay away from the stock ones for some of the reasons you mentioned, namely performance. As for the ACX cooler you mentioned: does that include the newer ACX 2.0? I did some research on that and it seems they are cooler and quieter.

    I looked at the G1. Looks pretty solid and has great reviews. And by that I mean it looks like a beastly card after watching some YouTube videos.
     
  43. Dan.J

    Dan.J Notebook Geek

    Reputations:
    0
    Messages:
    88
    Likes Received:
    16
    Trophy Points:
    16
    Nothing wrong with the "reference" cards from Nvidia; the blower style works GREAT for getting heat out of the case. I have been in this game a long time and have had MANY more issues with non-reference cards than with the reference ones. And as far as overclocking goes, the reference cards generally do just fine: for example, my reference 980 Ti gets close to 1500 MHz boost on the core with a 110% power limit while staying around 70C at 70% fan speed while gaming with heavy GPU usage. Just my opinion, of course.
     
  44. TastefulTech

    TastefulTech Notebook Enthusiast

    Reputations:
    0
    Messages:
    49
    Likes Received:
    4
    Trophy Points:
    16
    GTX 980 Alienware Graphics Amplifier FPS Test! (The Witcher 3) 4K

     
  45. Punisher5.0

    Punisher5.0 Notebook Geek

    Reputations:
    0
    Messages:
    79
    Likes Received:
    16
    Trophy Points:
    16
    That was a nice bump up with the GA. How did it run better on 1080p Ultra compared to Medium, though?
     
  46. kakashisensei

    kakashisensei Notebook Consultant

    Reputations:
    41
    Messages:
    217
    Likes Received:
    27
    Trophy Points:
    41
    Coil whine is when the inductor coils reach a resonant frequency, causing micro vibrations in the coils. This produces a high-frequency, very unpleasant buzzing sound. It's predominantly an issue for high end graphics cards, since they draw more power, and it happens when the card is under heavy load, not when idle. In my experience, nearly all high end graphics cards have coil whine; it's just a matter of how loud it is. If you have sensitive hearing, you will notice. Most of the time, loud coil whine can be attributed to a subpar power supply providing "noisy" power to the graphics card. The default power supply in the Graphics Amplifier is pretty ****ty. I tried 4-5 high end cards, and they all had loud coil whine that I could easily hear 4 feet away from the Graphics Amplifier. I changed out the power supply for a higher end one, and the coil whine is much lower, to the point where I cannot hear it if I close the Graphics Amplifier case and don't have my ear right next to it.

    Do note that pretty much all power supplies will have issues fitting in the Graphics Amplifier: they do not have a matching power plug location, and most high end ones are too long and prevent the lid from closing. I had to use an SFX power supply; it's not mounted, but it fits inside.

    For EVGA, the ACX/ACX 2.0 coolers will fit with no issues. Note the Classified and Kingpin do not use ACX coolers.
     
    cruisin5268d likes this.
  47. kakashisensei

    kakashisensei Notebook Consultant

    Reputations:
    41
    Messages:
    217
    Likes Received:
    27
    Trophy Points:
    41
    Why does Medium perform much slower than Ultra? I think the guy switched the Ultra and Medium labels.
     
  48. cruisin5268d

    cruisin5268d Notebook Evangelist

    Reputations:
    51
    Messages:
    487
    Likes Received:
    119
    Trophy Points:
    56
    Great explanation, thank you. My hearing has always been sensitive to high-pitch/high-frequency sounds, and static, buzzing, or repetitive noises (such as my ex nagging) all drive me bonkers.

    For the power supply, do you remember the specific model? If that helps, then I definitely see replacing it in my future.

    For the $300 MSRP, this should have included a decent power supply capable of producing clean, regulated power, and the cases should have had some sort of quality control so they don't require high levels of force to open. And, last but not least, Dellienware should have made it a bit taller. Huge oversight on their part to design it so it doesn't accommodate the most common models its customers would want to use.
     
  49. CSHawkeye81

    CSHawkeye81 Notebook Deity

    Reputations:
    194
    Messages:
    1,596
    Likes Received:
    175
    Trophy Points:
    81
    I might consider trying the AGA with my new Alienware 17 R3. Have they fixed the bugs and issues?
     
  50. nguyenquanavi

    nguyenquanavi Notebook Enthusiast

    Reputations:
    0
    Messages:
    17
    Likes Received:
    2
    Trophy Points:
    6
    Anyone get "windows 10 boot into black screen with mouse active"?

    Sent from my D5803 using Tapatalk
     