
    *OFFICIAL* Alienware "Graphics Amplifier" Owner's Lounge and Benchmark Thread (All 13, 15 and 17)

    Discussion in '2015+ Alienware 13 / 15 / 17' started by Mr. Fox, Dec 10, 2014.

  1. Jamessh1

    Jamessh1 Notebook Enthusiast

    Reputations:
    0
    Messages:
    17
    Likes Received:
    0
    Trophy Points:
    5
    I just want to confirm that there is only one version of the Alienware Graphics Amplifier and that I do indeed have the right port on the back of my laptop. My concern is that I thought I had an early R2 or late R1 model, as I have a DisplayPort on the back of my laptop. I bought it last September and it came with the 980M. Hoping a kind person who knows more can confirm the graphics amp will work on my system.

    https://goo.gl/photos/3ooK4L5KExabPXoA7


    My apologies, as I can't seem to embed the image as expected. Hopefully the first link is fully accessible, as it shows the available ports in question.

    Many thanks in advance to anyone who helps!
     
  2. Game7a1

    Game7a1 ?

    Reputations:
    529
    Messages:
    3,159
    Likes Received:
    1,040
    Trophy Points:
    231
    The GA works on both the Haswell (+ Broadwell for the 13) and Skylake models.
    You have the Haswell model.
     
    Jamessh1 likes this.
  3. Jamessh1

    Jamessh1 Notebook Enthusiast

    Reputations:
    0
    Messages:
    17
    Likes Received:
    0
    Trophy Points:
    5
    Much appreciated!

    The new Pascal cards (1000 series) were mentioned above as possibly working with the GA. I was reading that a 550 watt PSU is recommended for the 1000 series, so would that mean the GA may not be compatible (since it recommends a 375 W card or less)?

    Any thoughts?
     
  4. Game7a1

    Game7a1 ?

    Reputations:
    529
    Messages:
    3,159
    Likes Received:
    1,040
    Trophy Points:
    231
    When Nvidia/AMD recommend a PSU, they recommend it based on a GPU plus the rest of a computer. Since it's just the GPU in the amplifier, it's best to go by the TBP (total board power) of the graphics card.
     
    Jamessh1 likes this.
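    A quick back-of-the-envelope check of the point above, as a minimal sketch: assuming Dell's published figures for the Graphics Amplifier are right (a 460 W internal PSU with a 375 W ceiling for the card), you can compare a card's reference board power against that budget directly. The TDP numbers below are Nvidia's reference figures; partner cards with factory overclocks may draw somewhat more.

    [CODE]
    # Rough sanity check: compare a card's rated board power against the AGA's
    # card budget. The 375 W ceiling is Dell's published limit for the Graphics
    # Amplifier; the TDPs are Nvidia reference figures (assumptions noted above).
    AGA_CARD_BUDGET_W = 375

    REFERENCE_TDP_W = {
        "GTX 980 Ti": 250,
        "GTX 1070": 150,
        "GTX 1080": 180,
    }

    def fits_aga(card: str) -> bool:
        """Return True if the card's reference board power is within the AGA budget."""
        return REFERENCE_TDP_W[card] <= AGA_CARD_BUDGET_W

    for card, tdp in REFERENCE_TDP_W.items():
        print(f"{card}: {tdp} W -> {'OK' if fits_aga(card) else 'over budget'}")
    [/CODE]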
  5. frizzywizzy

    frizzywizzy Newbie

    Reputations:
    0
    Messages:
    6
    Likes Received:
    0
    Trophy Points:
    5
    Anybody know if the GTX 1080 or GTX 1070 will fit into the graphics amplifier? Thanks.
     
  6. dannycao2412

    dannycao2412 Notebook Enthusiast

    Reputations:
    7
    Messages:
    21
    Likes Received:
    11
    Trophy Points:
    6
    Yes, the measurements are exactly the same as the previous generation of cards.
     
  7. Cass-Olé

    Cass-Olé Notebook Evangelist

    Reputations:
    728
    Messages:
    338
    Likes Received:
    986
    Trophy Points:
    106
    My first look at an external cooler mod:
    s-l1603.jpg
     
    guttsy likes this.
  8. dannycao2412

    dannycao2412 Notebook Enthusiast

    Reputations:
    7
    Messages:
    21
    Likes Received:
    11
    Trophy Points:
    6
  9. Cass-Olé

    Cass-Olé Notebook Evangelist

    Reputations:
    728
    Messages:
    338
    Likes Received:
    986
    Trophy Points:
    106
    "above find a photo / link, Ti + AGA currently for sale, which already has the mod done for its next owner; if another photo resides here - a claim says one does - then it may have its own twist on where one can make such a cut-out when desired".
     
    Last edited: May 9, 2016
  10. sirleeofroy

    sirleeofroy Notebook Evangelist

    Reputations:
    22
    Messages:
    506
    Likes Received:
    169
    Trophy Points:
    56
    I think the 550 watt recommendation takes into account a full desktop system running from the same power supply. The GTX 1080 is reportedly rated at 180 W, so the standard AGA will be fine. Pending driver compatibility, of course...
     
  11. frizzywizzy

    frizzywizzy Newbie

    Reputations:
    0
    Messages:
    6
    Likes Received:
    0
    Trophy Points:
    5
    Will it function properly?
     
  12. guttsy

    guttsy Notebook Consultant

    Reputations:
    2
    Messages:
    110
    Likes Received:
    38
    Trophy Points:
    41
    When people talk of drivers for the AGA, are these merely INF mods of the standard Nvidia / AMD drivers or something more?
     
  13. dannycao2412

    dannycao2412 Notebook Enthusiast

    Reputations:
    7
    Messages:
    21
    Likes Received:
    11
    Trophy Points:
    6
    Chill out, man, did I offend you? If so, sorry for that. And yes, I have followed this thread since it was created :D
     
  14. Game7a1

    Game7a1 ?

    Reputations:
    529
    Messages:
    3,159
    Likes Received:
    1,040
    Trophy Points:
    231
    Sadly, modding the .inf files actually does not help.
    When it comes to driver support, the Dell .inf in an Nvidia driver has to have the ID for the GPU. So, for the GTX 1080 to work properly, a Dell ID needs to be in the Nvidia driver from the start.
    AMD drivers don't really suffer from this problem. A couple of their GPUs do have some issues, but thankfully they're the exception.
     
    guttsy likes this.
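    To make the "ID in the driver" point above concrete, here's a minimal sketch that scans an extracted Nvidia driver package for .inf lines containing a given PCI device ID together with Dell's subsystem vendor ID. The folder path is a placeholder and the loose string match is purely illustrative; 10DE:1B80 is the GTX 1080's PCI device ID and 1028 is Dell's PCI vendor ID.

    [CODE]
    # Minimal sketch (illustrative only): look for a Dell-flagged entry for a
    # given GPU in an extracted Nvidia driver package. Path is a placeholder.
    from pathlib import Path

    DRIVER_DIR = Path(r"C:\NVIDIA\extracted_driver")  # hypothetical extraction folder
    DEVICE_ID = "DEV_1B80"   # GTX 1080 (PCI device ID 1B80, vendor 10DE)
    DELL_VENDOR = "1028"     # Dell's PCI subsystem vendor ID

    for inf in DRIVER_DIR.rglob("*.inf"):
        for line in inf.read_text(errors="ignore").splitlines():
            # Hardware IDs look roughly like PCI\VEN_10DE&DEV_1B80&SUBSYS_xxxx1028;
            # this is a loose match, good enough for a quick manual check.
            if DEVICE_ID in line and "SUBSYS_" in line and DELL_VENDOR in line:
                print(f"{inf.name}: {line.strip()}")
    [/CODE]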
  15. DeeX

    DeeX THz

    Reputations:
    254
    Messages:
    1,710
    Likes Received:
    907
    Trophy Points:
    131
    So, to avoid reading the 137 pages of things here, I have a few questions...

    1. Does the graphics amp run the card at x16?
    2. Is the PSU upgradable (via a mod or hack)?
    3. Is the performance the same if using the internal laptop LCD?

    Anything else worth mentioning for someone considering one of these?
    I don't really have a dire need, but it might be something cool to consider. :p
     
  16. Game7a1

    Game7a1 ?

    Reputations:
    529
    Messages:
    3,159
    Likes Received:
    1,040
    Trophy Points:
    231
    1. No. It runs at x4, since that's the physical limitation.
    2. Yes.
    3. No; from my analysis, there's about a 10% drop on average.
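    For a sense of what the x4 link costs in raw bandwidth, here's a minimal sketch assuming a PCIe 3.0 x4 link (the numbers scale with whatever PCIe generation the host actually provides). PCIe 3.0 runs at 8 GT/s per lane with 128b/130b encoding, so usable bandwidth is roughly 0.985 GB/s per lane, per direction.

    [CODE]
    # Back-of-the-envelope PCIe bandwidth comparison, assuming a Gen3 link.
    # 8 GT/s per lane with 128b/130b encoding ~= 0.985 GB/s usable per lane.
    GB_PER_S_PER_LANE = 8 * (128 / 130) / 8

    for lanes in (4, 16):
        print(f"x{lanes}: ~{lanes * GB_PER_S_PER_LANE:.2f} GB/s per direction")
    [/CODE]

    The extra drop reported when using the internal panel presumably comes from sending the rendered frames back across that same narrow link, on top of the x4 limit itself.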
     
  17. SaigonBrit

    SaigonBrit Notebook Guru

    Reputations:
    5
    Messages:
    50
    Likes Received:
    8
    Trophy Points:
    16

    Hey there, my setup is almost identical to yours:
    Alienware 17 R3 | 17" 1080p IPS | i7-6700HQ | 32GB | GTX 980m 8GB | 256GB SSD + 1TB HDD

    I have just ordered a graphics amp, so I'm thinking about a 980 Ti (hopefully the cost will plummet now that the 1070/1080 have been announced) or possibly a 1070 at a bargain price of USD 399. Do you reckon the 1070 will be compatible with the AGA? (I know this has already been addressed in the thread above, but you seem knowledgeable and I'd like your opinion.) Would make sense to buy a 1070 'cos it's around or slightly above a Titan X in terms of GFLOPS, if I understood correctly.

    Thanks a million.
     
  18. rinneh

    rinneh Notebook Prophet

    Reputations:
    854
    Messages:
    4,897
    Likes Received:
    2,191
    Trophy Points:
    231
    I would wait till the 1070 or 1080 come out. I don't think 980 Ti prices will plummet except on the used market.

    We have to wait for an AGA update that supports these cards, though. I hope Dell will be quick to release it.
     
  19. SaigonBrit

    SaigonBrit Notebook Guru

    Reputations:
    5
    Messages:
    50
    Likes Received:
    8
    Trophy Points:
    16
    Thanks so much for this. I appreciate it :)
     
  20. Leonard92

    Leonard92 Notebook Guru

    Reputations:
    0
    Messages:
    73
    Likes Received:
    1
    Trophy Points:
    16
    Okay, now to look for the graphics card...
     
  21. tinay09

    tinay09 Notebook Enthusiast

    Reputations:
    0
    Messages:
    16
    Likes Received:
    1
    Trophy Points:
    6
    Help! Would DSR work with an AW17 R2 with a GA + Strix 970?

    I couldn't find the option to enable DSR. Just wanted to check if it's just me.
     
  22. zergslayer69

    zergslayer69 Liquid Hz

    Reputations:
    62
    Messages:
    1,551
    Likes Received:
    91
    Trophy Points:
    66
    Thought someone mentioned a while back DSR isn't supported on the amp?
     
  23. MatthewAMEL

    MatthewAMEL Notebook Consultant

    Reputations:
    80
    Messages:
    128
    Likes Received:
    13
    Trophy Points:
    31
    Not you. Not supported by AGA.
     
  24. gamerchick27

    gamerchick27 Notebook Enthusiast

    Reputations:
    0
    Messages:
    13
    Likes Received:
    0
    Trophy Points:
    5
    I have an Alienware 15 with an i7 processor and a graphics amplifier with an EVGA 980 Ti SC inside. When I run the official Oculus Rift compatibility check tool, it says my i7-4710HQ doesn't meet the minimum requirement of an i5-4590 or higher. I'm guessing this is obviously an error? Also it says my graphics card is not ready; it says I am using Intel HD Graphics 4600. I'm using a 980 Ti SC!!? It runs any game on ultra 4K at around 30-60 fps. Please help me. I really want to buy the Rift. Will it work, and why is it telling me it won't? I had to save money as a waitress for months to save up for all this...
     
  25. guttsy

    guttsy Notebook Consultant

    Reputations:
    2
    Messages:
    110
    Likes Received:
    38
    Trophy Points:
    41
    Your best bet might be the Oculus forums or /r/oculus?

    At least one person reporting success with the AGA and Rift: https://www.reddit.com/r/oculus/comments/4fler3/rift_cv1_working_fine_with_alienware_13_and/
     
  26. xbouncingsoulx

    xbouncingsoulx Notebook Enthusiast

    Reputations:
    7
    Messages:
    46
    Likes Received:
    14
    Trophy Points:
    16
    The AGA works well with the Rift. I got mine a week or so ago, and setting it up was no problem at all (same system spec as you). The 4710 has a usage of approx. 40% while playing EVE: Valkyrie, so I'd say it should be plenty for a year of VR gaming... or maybe even two.

    PS: the Rift is pretty fu**ing awesome! :-D Didn't expect it to be that awesome!
     
    hmscott likes this.
  27. ha1o2surfer

    ha1o2surfer Notebook Evangelist

    Reputations:
    75
    Messages:
    556
    Likes Received:
    121
    Trophy Points:
    56
    DSR would work, but I'm pretty sure you will have to NOT use Optimus. Meaning the Intel GPU can't be used to drive the internal display at all, even if you're not using it for gaming. Optimus is what interferes with DSR.
     
  28. gamerchick27

    gamerchick27 Notebook Enthusiast

    Reputations:
    0
    Messages:
    13
    Likes Received:
    0
    Trophy Points:
    5
    I have an external monitor, an ASUS 1080p with 1 ms response time... kicks butt, hun. So I have this to ask: should I use that with the Oculus Rift? Should I use the AGA, or should I use the laptop? Where should I plug it in? I'm too girly for this.
     
  29. gamerchick27

    gamerchick27 Notebook Enthusiast

    Reputations:
    0
    Messages:
    13
    Likes Received:
    0
    Trophy Points:
    5
    Also, the GTX 1080 has some amazing benchmarks. I want to make sure that my graphics card will last me for a long time. The 980 Ti SC is great, but I wish I hadn't bought it right before this came out. The MAIN question is: will the CPU of my laptop make the purchase of the AGA completely pointless? I spent a lot of money on this rig I made. How long will it last? Sorry if I sound like I am PMS'ing, but I don't want to feel like I wasted my money on a high-end graphics card/AGA and a good monitor. I have a laptop because I travel the US.
     
  30. xbouncingsoulx

    xbouncingsoulx Notebook Enthusiast

    Reputations:
    7
    Messages:
    46
    Likes Received:
    14
    Trophy Points:
    16
    You have to use the HDMI-out of the 980 Ti, otherwise the Rift won't recognize the external graphics card.
     
  31. xbouncingsoulx

    xbouncingsoulx Notebook Enthusiast

    Reputations:
    7
    Messages:
    46
    Likes Received:
    14
    Trophy Points:
    16
    I think the 4710HQ is a nice match for the 980 Ti... the graphics card is probably even more capable than the CPU, so don't worry about it too much.
    The only question you should ask yourself is: do I wanna play games in higher resolutions than 1440p? If yes, go for the GTX 1080.
    Otherwise the 980 Ti will do the job for pretty nice framerates... and yes, I'm talking about more than 60 :) The GPU won't be the bottleneck.

    PS: ...and my personal opinion is that even the 1080 won't be able to deliver 60 frames at 4K... but I think we all will know a bit more about that when the first benchmarks drop.
     
  32. gamerchick27

    gamerchick27 Notebook Enthusiast

    Reputations:
    0
    Messages:
    13
    Likes Received:
    0
    Trophy Points:
    5
    Still, I am glad I have one of the best cards available. I had the Titan X ordered but cancelled the order to save a TON of money when the superclocked 980 Ti had practically equal benchmarks (from extensive research).

    So is everyone saying to ignore the stupid Rift compatibility checkers and that they are wrong sometimes?
     
  33. xbouncingsoulx

    xbouncingsoulx Notebook Enthusiast

    Reputations:
    7
    Messages:
    46
    Likes Received:
    14
    Trophy Points:
    16
    They aren't wrong, it's just the way they work... especially the Rift compatibility check. It doesn't measure performance; it just compares your CPU to a list on which the 4710HQ - or any other mobile CPU - isn't listed. That's it.

    Try the Steam VR performance check and you'll get the answer you're expecting, as it actually measures performance ;)
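    Purely as an illustration of the distinction above (the CPU list and frame-time numbers below are made up, not Oculus's actual data): a whitelist check fails any CPU that isn't literally on the list, while a measured check passes anything that actually sustains the Rift's 90 Hz frame budget of roughly 11.1 ms per frame.

    [CODE]
    # Illustrative only: whitelist check vs. measured check. The CPU list and
    # frame-time values are hypothetical examples.
    MIN_SPEC_CPUS = {"Core i5-4590", "Core i7-4790", "Core i7-6700K"}

    def whitelist_check(cpu_name: str) -> bool:
        # Fails any CPU not literally on the list, regardless of real performance.
        return cpu_name in MIN_SPEC_CPUS

    def measured_check(avg_frametime_ms: float, budget_ms: float = 1000 / 90) -> bool:
        # Passes if the machine actually sustains ~90 fps (about 11.1 ms per frame).
        return avg_frametime_ms <= budget_ms

    print(whitelist_check("Core i7-4710HQ"))     # False: mobile CPU isn't on the list
    print(measured_check(avg_frametime_ms=9.5))  # True: measured headroom
    [/CODE]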
     
  34. gamerchick27

    gamerchick27 Notebook Enthusiast

    Reputations:
    0
    Messages:
    13
    Likes Received:
    0
    Trophy Points:
    5
    Everyone has been so helpful. Thank you.

    Two last unrelated questions. Why did my frame rate drop down to 12-18 on the Fire Strike Ultra 4K benchmark? I am hoping to game in 4K at 30 fps or better. My rig is the 980 Ti superclocked in the AGA with the i7 Alienware 15 laptop. I need to know this before purchasing my 4K monitor.

    I was so pissed that they are coming out with the GTX 1080 a month or so after I bought this. Is it really that much better? Did I screw myself buying the 980 Ti SC so soon before this new card is released?
     
  35. rinneh

    rinneh Notebook Prophet

    Reputations:
    854
    Messages:
    4,897
    Likes Received:
    2,191
    Trophy Points:
    231
    If you can, return it. But it's not that much slower; about 20-35%.

    4K is still sometimes a bridge too far, though, even for the 1080.
     
  36. john green

    john green Notebook Consultant

    Reputations:
    0
    Messages:
    121
    Likes Received:
    33
    Trophy Points:
    41
    Agreed. 4k makes huge demands on resources, and not always in obvious ways.
     
  37. gamerchick27

    gamerchick27 Notebook Enthusiast

    Reputations:
    0
    Messages:
    13
    Likes Received:
    0
    Trophy Points:
    5
    I can't return it, and now I am having anxiety about it. I spent $650 on the 980 Ti... I wish I had known. The 980 Ti superclocked is about 10% faster than the regular 980 Ti, I noticed in benchmarks, so maybe only 10% slower???
     
  38. DeeX

    DeeX THz

    Reputations:
    254
    Messages:
    1,710
    Likes Received:
    907
    Trophy Points:
    131
    Where did you buy it from?
     
  39. DeeX

    DeeX THz

    Reputations:
    254
    Messages:
    1,710
    Likes Received:
    907
    Trophy Points:
    131
    Where is the best place to pick up a GA on a good deal?
    Does the outlet have these?
     
  40. Game7a1

    Game7a1 ?

    Reputations:
    529
    Messages:
    3,159
    Likes Received:
    1,040
    Trophy Points:
    231
    eBay usually has the cheapest prices; otherwise, it's straight from Dell.
    I skimmed through the outlet, and it doesn't have the GA.
     
  41. Hatch3t

    Hatch3t Newbie

    Reputations:
    0
    Messages:
    7
    Likes Received:
    2
    Trophy Points:
    6
    Hey All,

    Just wanted to share with everyone what an NZXT closed-loop/AIO looks like set up in/on the AGA. This setup utilizes the Gigabyte GTX 980 Xtreme Gaming 4GB OC card, outfitted with NZXT's Kraken G10 bracket and Kraken X31 AIO liquid cooler. Without further ado:

    [six images of the build]

    If there's anything anyone would like to know about what was used to accomplish the build, or if there are any inquiries into how it was carried out, let me know :)
     
    Last edited: May 20, 2016
    hmscott likes this.
  42. Game7a1

    Game7a1 ?

    Reputations:
    529
    Messages:
    3,159
    Likes Received:
    1,040
    Trophy Points:
    231
    @Hatch3t Your images aren't showing up.
    EDIT: Now they are.
     
    Last edited: May 20, 2016
  43. guttsy

    guttsy Notebook Consultant

    Reputations:
    2
    Messages:
    110
    Likes Received:
    38
    Trophy Points:
    41
    Images showing for me. Very nice modding. Does the VRAM not require a heatsink on these cards?
     
  44. Mobius 1

    Mobius 1 Notebook Nobel Laureate

    Reputations:
    3,447
    Messages:
    9,069
    Likes Received:
    6,376
    Trophy Points:
    681
    Most likely he's going to fry the VRM and VRAM in the long run. The Kraken bracket doesn't seem to have a heatsink for the critical components.

    Also, can you use the GA if the box is not closed?
     
  45. guttsy

    guttsy Notebook Consultant

    Reputations:
    2
    Messages:
    110
    Likes Received:
    38
    Trophy Points:
    41
    Perhaps he's relying on the fan mounted near the rear of the card for VRM cooling and the air it would move in the general vicinity for VRAM cooling?

    The first images show the AGA closed, though?
     
  46. Hatch3t

    Hatch3t Newbie

    Reputations:
    0
    Messages:
    7
    Likes Received:
    2
    Trophy Points:
    6
    Hi guttsy,

    Appreciate the compliment :D Wasn't sure about that either, and NZXT's documentation for the bracket mentions nothing about using a plate/heatsink with their product. I am myself a bit worried about it (hence upgrading the fan to a Noctua), but I will just have to monitor temps and see how things go. Other forums state it shouldn't be an issue... and now that I think of it, I'm not sure what would honestly fit between the bracket and the VRAM.
     
  47. Mobius 1

    Mobius 1 Notebook Nobel Laureate

    Reputations:
    3,447
    Messages:
    9,069
    Likes Received:
    6,376
    Trophy Points:
    681
    Airflow without a heatsink to increase surface area is near useless.

    No no, I mean, can you run the graphics amplifier without closing the lid?
     
  48. Hatch3t

    Hatch3t Newbie

    Reputations:
    0
    Messages:
    7
    Likes Received:
    2
    Trophy Points:
    6
    Hopefully not, but then why would NZXT market this product for use with NVIDIA cards if it does not cool the VRM/VRAM adequately (I upgraded the fan to be doubly sure of it)? They have this product available for a wide number of cards, and even mention the option of adding aftermarket heatsinks should the need arise. If the temps are higher than I am comfortable with, I'll go the route of purchasing one. We'll have to see.

    The case does close; I didn't supply a pic of the rear, so here is one:
    [images of the rear of the case]
     
  49. Mobius 1

    Mobius 1 Notebook Nobel Laureate

    Reputations:
    3,447
    Messages:
    9,069
    Likes Received:
    6,376
    Trophy Points:
    681
    But if you leave the GA open, will it still run?
     
  50. guttsy

    guttsy Notebook Consultant

    Reputations:
    2
    Messages:
    110
    Likes Received:
    38
    Trophy Points:
    41
    You could use something like these or these and cut / bend them down as necessary to give you some peace of mind?
     
    Last edited: May 20, 2016
    Hatch3t likes this.