The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information posted on the forums is preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.

    *OFFICIAL* Alienware "Graphics Amplifier" Owner's Lounge and Benchmark Thread (All 13, 15 and 17)

    Discussion in '2015+ Alienware 13 / 15 / 17' started by Mr. Fox, Dec 10, 2014.

  1. MSGaldenzi

    MSGaldenzi Notebook Deity

    Reputations:
    109
    Messages:
    1,251
    Likes Received:
    85
    Trophy Points:
    66
    They are good for entertainment centers and as a TV setup. Also for traveling, the X51 fits in larger backpacks... I do assume you were talking about the X51.
     
  2. MSGaldenzi

    MSGaldenzi Notebook Deity

    Reputations:
    109
    Messages:
    1,251
    Likes Received:
    85
    Trophy Points:
    66
  3. armymax

    armymax Notebook Geek

    Reputations:
    0
    Messages:
    75
    Likes Received:
    11
    Trophy Points:
    16
    I am going to buy an AW 15 R2 with full specs. I was wondering if the GA will be compatible with the Oculus Rift, since it disables Optimus and connects the output directly to the dGPU. Thank you
     
  4. Game7a1

    Game7a1 ?

    Reputations:
    529
    Messages:
    3,159
    Likes Received:
    1,040
    Trophy Points:
    231
    It should be. I don't see how it couldn't be.
    I think someone from AWA may have successfully used the Rift with the laptop + GA.
     
    armymax likes this.
  5. armymax

    armymax Notebook Geek

    Reputations:
    0
    Messages:
    75
    Likes Received:
    11
    Trophy Points:
    16
    Should I buy the laptop with the i7-6820HK, or is the 6700HQ future-proof too? Does the GA have ports to connect it to a 4K 60 Hz screen?
     
  6. Game7a1

    Game7a1 ?

    Reputations:
    529
    Messages:
    3,159
    Likes Received:
    1,040
    Trophy Points:
    231
    1. The i7-6700HQ should be fine, but if you plan on overclocking or just owning the laptop for a long time, the i7-6820HK is the better option.
    2. That all depends on which desktop GPU you get.
     
  7. armymax

    armymax Notebook Geek

    Reputations:
    0
    Messages:
    75
    Likes Received:
    11
    Trophy Points:
    16
    I don't plan on overclocking, or maybe only in the very distant future. I will buy the amplifier when the new Pascal cards come out, so I don't want the CPU to become restrictive in the coming years.
     
  8. grkstyla

    grkstyla Notebook Geek

    Reputations:
    0
    Messages:
    88
    Likes Received:
    2
    Trophy Points:
    16
    Not happy with the standard CPU performance, I would definitely opt for the better CPU. And really, how CPU-intensive is VR? It is basically two monitors outputting at different refresh rates, isn't it? How does the Oculus connect to a PC? Basic connections will be the same as if you had a desktop, but certain features may be unavailable. For example, I can't use DSR on a monitor connected directly to the 980 Ti in the amp, and forget about having full use of Nvidia features when using the laptop screen while on the amp. I'm interested in how you get on with this; keep us posted.
     
  9. armymax

    armymax Notebook Geek

    Reputations:
    0
    Messages:
    75
    Likes Received:
    11
    Trophy Points:
    16
    From what I saw, the 6820HK is only about 4% faster than the 6700HQ, at the expense of slightly higher heat. Overclocking is a possibility, but the heat would be significantly higher. So I think I will take the 6700HQ.
    About the Oculus, I know it is just a video output (with motion sensors) and it has a 2160x1200 resolution at 90 Hz.
     
  10. grkstyla

    grkstyla Notebook Geek

    Reputations:
    0
    Messages:
    88
    Likes Received:
    2
    Trophy Points:
    16
    Wow, only 4%? Maybe it's worth learning to overclock, because when connected to the amp both fans are dedicated to the CPU, so heat won't be an issue (idle temperatures are always high on these laptops, though).
     
  11. Splintah

    Splintah Notebook Deity

    Reputations:
    278
    Messages:
    1,948
    Likes Received:
    595
    Trophy Points:
    131
    I have officially been ruined by high-Hz panels. I went back to using my 17 R2 today and everything feels so choppy when I move it. Damn you, ROG Swift and GT72 with 75 Hz.
     
  12. grkstyla

    grkstyla Notebook Geek

    Reputations:
    0
    Messages:
    88
    Likes Received:
    2
    Trophy Points:
    16
    I didn't think it was so different; I used to use a CRT at 110 Hz and can't remember it feeling very different from my first LCD at 60 Hz.
     
  13. Splintah

    Splintah Notebook Deity

    Reputations:
    278
    Messages:
    1,948
    Likes Received:
    595
    Trophy Points:
    131
    It will probably take a few hours of readjustment, maybe a day. I didn't expect it to be so noticeable to me, though.
     
  14. teamdoa

    teamdoa Newbie

    Reputations:
    0
    Messages:
    6
    Likes Received:
    2
    Trophy Points:
    6
    I just bought an Alienware 17 R3 with a graphics amplifier. I have a GTX 980 Ti in it at the moment and everything seems to work fine. The only thing I want to do is reduce the noise, so a future fan mod will be in the works.

    My question is regarding the powering up of the laptop with the amplifier installed. Is it normal when you switch on for the laptop and amplifier to come on for a few seconds, then switch off and then on again?

    Cheers
     
  15. raiden87

    raiden87 Notebook Evangelist

    Reputations:
    46
    Messages:
    341
    Likes Received:
    123
    Trophy Points:
    56
    For the R3 it seems normal, AFAIK. My AW15 R1 started normally without cycling on and off; only if I unhooked the amp, used the laptop mobile, and then hooked it back on did it do a short power on/off/on.
     
  16. teamdoa

    teamdoa Newbie

    Reputations:
    0
    Messages:
    6
    Likes Received:
    2
    Trophy Points:
    6
    Seems a little odd, but I had a word with Dell and it appears to be normal.

    I just ordered a Noctua NF-A9 FLX fan, so hopefully that improves the crazy amount of noise the amplifier puts out lol.
     
  17. MSGaldenzi

    MSGaldenzi Notebook Deity

    Reputations:
    109
    Messages:
    1,251
    Likes Received:
    85
    Trophy Points:
    66
    So I put that R9 290 in the amplifier and noticed some issues with heat. My games would run fine for the first 15-25 minutes, but after that the frame rates would drop drastically. I am talking about going from 100+ FPS in Battlefield Hardline all the way down to 30 or so. The same thing happened with League of Legends, where I was getting around 400 FPS uncapped and it dropped to 30 FPS in some battles mid-to-late game. Is this just due to heat from the card? I know the R9 290 is a hot card, but is that normal? I have not tried it in a desktop as I don't have one to plop it into, but could it be something else? The card is clean and ran a 9172 Fire Strike score basically back to back. This is my first higher-end AMD card; I usually stick to Nvidia, and now I remember why.
     
  18. Dan.J

    Dan.J Notebook Geek

    Reputations:
    0
    Messages:
    88
    Likes Received:
    16
    Trophy Points:
    16
    Are you running MSI Afterburner or similar and watching GPU temps? It should be easy to see whether or not the card is getting hot. My 980 Ti runs cool even overclocked and under heavy usage in the GA box, but mine is a blower design, which pumps the heat out of the case rather than just dumping it inside. I believe I could easily run my GA without the front fan if I wanted to. I'll bet it's something besides heat throttling the card, but you'll have to log/watch GPU temps to rule that out first.
     
  19. MSGaldenzi

    MSGaldenzi Notebook Deity

    Reputations:
    109
    Messages:
    1,251
    Likes Received:
    85
    Trophy Points:
    66
    It is a blower type as well. The card often runs at 95°C, and I think it is rated to run around there... I didn't log it with Afterburner, but I watched the temps in the Alienware software and it was pretty much just under 100°C.

    I actually had the GA on a small side table beside me, and it got uncomfortably hot with the air blowing on me.
     
  20. Welshmousepk

    Welshmousepk Notebook Consultant

    Reputations:
    1
    Messages:
    182
    Likes Received:
    11
    Trophy Points:
    31
    Did we ever get the BIOS update to allow PCIe 3.0 on the new Skylake models with the AGA?

    I intend to buy an amplifier soon but haven't seen a BIOS update. If it's still PCIe 2.0 x4, I'm gonna be rather put out.
     
  21. Dan.J

    Dan.J Notebook Geek

    Reputations:
    0
    Messages:
    88
    Likes Received:
    16
    Trophy Points:
    16
    The reference-cooler 290 runs hot, and yours sounds plenty hot for sure. I would install MSI Afterburner, enable user-defined fan control, and make a fan profile that is pretty aggressive. I'm betting that will drop your temps considerably, as my guess is the stock fan profile is weak and lets the card run super hot like that. Try it and see if your issues improve.
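    An aggressive fan profile of the kind suggested above is just a mapping from GPU temperature to fan duty cycle, interpolated between the points you drag in Afterburner's curve editor. A minimal sketch of that idea, with illustrative breakpoints (not Afterburner defaults):

    ```python
    # Hypothetical aggressive fan curve for a reference R9 290, sketched as
    # linear interpolation between (temperature C, fan %) breakpoints --
    # the same shape you would draw in MSI Afterburner's curve editor.
    # The breakpoints below are illustrative assumptions, not real defaults.

    CURVE = [(40, 30), (60, 50), (75, 75), (85, 100)]

    def fan_percent(temp_c, curve=CURVE):
        """Return fan duty cycle (%) for a GPU temperature in Celsius."""
        if temp_c <= curve[0][0]:
            return curve[0][1]          # below first point: floor speed
        if temp_c >= curve[-1][0]:
            return curve[-1][1]         # above last point: full speed
        for (t0, f0), (t1, f1) in zip(curve, curve[1:]):
            if t0 <= temp_c <= t1:
                # Linear interpolation between adjacent breakpoints.
                return f0 + (f1 - f0) * (temp_c - t0) / (t1 - t0)

    if __name__ == "__main__":
        for t in (35, 60, 80, 95):
            print(f"{t} C -> {fan_percent(t):.0f}% fan")
    ```

    The point of the steep 75-85 °C segment is to ramp hard before the card reaches its ~95 °C throttle territory, trading noise for sustained clocks.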
     
  22. MSGaldenzi

    MSGaldenzi Notebook Deity

    Reputations:
    109
    Messages:
    1,251
    Likes Received:
    85
    Trophy Points:
    66
    I will give that a shot, thanks.
     
  23. Game7a1

    Game7a1 ?

    Reputations:
    529
    Messages:
    3,159
    Likes Received:
    1,040
    Trophy Points:
    231
    Jemplayer likes this.
  24. Welshmousepk

    Welshmousepk Notebook Consultant

    Reputations:
    1
    Messages:
    182
    Likes Received:
    11
    Trophy Points:
    31
  25. raiden87

    raiden87 Notebook Evangelist

    Reputations:
    46
    Messages:
    341
    Likes Received:
    123
    Trophy Points:
    56
    GPU-Z still reports Gen2.

    *EDIT* Frank Azor said that AWCC isn't the only thing that's needed for Gen3! They will bring a BIOS update and a new (amp?) driver that are necessary for it to work. ETA is next week if everything goes to plan.
     
    Last edited: Nov 12, 2015
  26. Spoke Lee

    Spoke Lee Newbie

    Reputations:
    0
    Messages:
    6
    Likes Received:
    0
    Trophy Points:
    5
    I know G-Sync is unavailable on my 17 R3 because Optimus requires the display to be connected directly to the dGPU. My question is: does the graphics amplifier connect in such a way that the UHD IGZO panel can run G-Sync, or would it only work with a G-Sync display connected to the DisplayPort on the amplifier?
     
  27. Game7a1

    Game7a1 ?

    Reputations:
    529
    Messages:
    3,159
    Likes Received:
    1,040
    Trophy Points:
    231
    The laptop's screen isn't G-Sync ready, and even then, it's still controlled by the Intel GPU. You have to use an external monitor connected to the eGPU for G-Sync (or Free Sync).
     
    Last edited: Nov 15, 2015
  28. Welshmousepk

    Welshmousepk Notebook Consultant

    Reputations:
    1
    Messages:
    182
    Likes Received:
    11
    Trophy Points:
    31
    So I'm finally looking to order my AGA and a card (and a better monitor)

    Intend to run at 4k, but not sure I can splash for a 980 ti currently. Will likely have to get 390x or a 980 for now, and upgrade later. (or I could get a 980ti with the AGA, but keep running at 1080p until I can get a monitor, but that seems like the worse scenario)

    Does anyone have any specific insight regarding the AGA and which card to choose? I know AMD cards were preferred due to driver issues, but I believe that problem has been solved?

    I think I'd prefer a 980 as they generally overclock more, but any advice will be appreciated.
     
  29. john green

    john green Notebook Consultant

    Reputations:
    0
    Messages:
    121
    Likes Received:
    33
    Trophy Points:
    41
    My advice (I have no idea what I'm talking about) is to get the AGA now--they are widely available for $200--and the best GFX card you know will work. Don't buy a 4k monitor now because the prices are falling so fast it's tough to catch them. Keep your current monitor (I'm watching the Seahawks game on an HP 2311 (HD) I paid $130 for five years ago). Right now an Asus reference 4k monitor costs $1200 and everybody's cooing about how CHEAP they are--1/4 of last year's price! The Asus consumer 4k is $450 and by the summer I'll bet reference 4k monitors are close to that price.

    Wild cards: Everybody knows the nVidia 1000x series is coming soon. Do you want the VERY best or the best now? Wild card #2: A consumer 4k has a lot to offer if you're not doing VFX in your living room. Like I said, I'm still not done with the cheap HP, but maybe I'll add in a cheap 4k and go with a 3-monitor setup? Only my credit card company knows for sure!
     
  30. Welshmousepk

    Welshmousepk Notebook Consultant

    Reputations:
    1
    Messages:
    182
    Likes Received:
    11
    Trophy Points:
    31
    When the new Nvidia cards come out, I'll upgrade.
    Prices in my country for tech are super high, but it means a fantastic used market. Usually I sell GPUs for around 90 percent of what I paid.

    As for the monitor, I was planning to get a new xb281hk, which is roughly 3 times the cost of my 1080p monitor (doesn't seem so bad for 4 times the resolution plus g sync)

    I upgrade A LOT, so doing so doesn't bother me. I'm more concerned with getting the best out of what I can currently afford.

    It seems that a 980 Ti for 1080p would be overkill. I am leaning toward getting the 4K monitor and a 390X, since I can sell the 390X on just after New Year for most of what I paid and then upgrade to a 980 Ti.
     
  31. nguyenquanavi

    nguyenquanavi Notebook Enthusiast

    Reputations:
    0
    Messages:
    17
    Likes Received:
    2
    Trophy Points:
    6
    I have an external HDD plugged into the AGA and only get around 38 MB/s transfer speed, while testing the same HDD on the A13's USB port directly gives 120 MB/s. I have all the drivers updated; has anyone had the same issue?
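    For anyone wanting to reproduce a comparison like the 38 MB/s vs 120 MB/s above, a rough sequential-write check can be done with a short script rather than eyeballing a file copy. A minimal sketch; `TEST_PATH` is a placeholder you would point at the drive under test:

    ```python
    # Rough sequential write-throughput check for an external drive.
    # TEST_PATH is a hypothetical placeholder -- set it to a file on the
    # drive you want to measure (e.g. the HDD hanging off the AGA's USB).
    import os
    import time

    TEST_PATH = "testfile.bin"
    CHUNK = 4 * 1024 * 1024         # 4 MiB per write
    TOTAL = 256 * 1024 * 1024       # 256 MiB total

    def write_throughput(path=TEST_PATH, chunk=CHUNK, total=TOTAL):
        """Return sequential write speed in MB/s. The fsync keeps the OS
        page cache from making the number look better than the drive."""
        buf = os.urandom(chunk)
        start = time.perf_counter()
        with open(path, "wb") as f:
            for _ in range(total // chunk):
                f.write(buf)
            f.flush()
            os.fsync(f.fileno())
        elapsed = time.perf_counter() - start
        os.remove(path)             # clean up the test file
        return total / elapsed / 1e6

    if __name__ == "__main__":
        print(f"sequential write: {write_throughput():.0f} MB/s")
    ```

    Reads are harder to measure honestly from a script because of caching; for those, a dedicated tool such as CrystalDiskMark is the easier route.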
     
  32. Dan.J

    Dan.J Notebook Geek

    Reputations:
    0
    Messages:
    88
    Likes Received:
    16
    Trophy Points:
    16
    The 980 Ti is overkill for 1080p unless you're an FPS junkie like me and want a constant 120-144 FPS in the latest games with high detail; then it's needed. The 390X is a solid card and should do fine at 4K as long as you're OK with lower frame rates.
     
  33. Welshmousepk

    Welshmousepk Notebook Consultant

    Reputations:
    1
    Messages:
    182
    Likes Received:
    11
    Trophy Points:
    31
    Are you using an AGA with a 980ti?

    I'm curious to know how it performs for high refresh. Ideally, I'd like a 144hz 1440p monitor, but I was afraid that the combined issues of the laptop CPU and the PCI bandwidth would create a bottleneck at high frames.
    I would expect you to see less of a performance impact running 4K60 versus 1440p144.

    Even when I was running a desktop with a 980ti, I found that dropping from 4k to 1440p would only take me up by about 20 frames.

    It would seem silly to get a 144 Hz monitor if I end up only getting 60 FPS, especially if I could get roughly the same framerate at 4K.
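    The 4K60 vs 1440p144 trade-off above can be sanity-checked with raw pixel throughput (resolution times refresh rate), a quick back-of-envelope:

    ```python
    # Raw pixel throughput for the two monitor targets discussed above.
    def pixels_per_second(width, height, hz):
        """Pixels the GPU must deliver each second at full refresh."""
        return width * height * hz

    uhd_60 = pixels_per_second(3840, 2160, 60)     # 4K at 60 Hz
    qhd_144 = pixels_per_second(2560, 1440, 144)   # 1440p at 144 Hz

    # 1440p144 actually pushes slightly MORE pixels per second than 4K60,
    # so neither target is obviously "lighter" on the GPU -- though
    # per-frame CPU cost hits the 144 Hz target harder.
    print(uhd_60, qhd_144, round(qhd_144 / uhd_60, 3))
    ```

    This is only the shader/fill side of the story; hitting 144 distinct frames per second also demands 144 CPU-prepared frames, which is where a laptop CPU behind a narrow PCIe link becomes the likelier bottleneck.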
     
  34. MatthewAMEL

    MatthewAMEL Notebook Consultant

    Reputations:
    80
    Messages:
    128
    Likes Received:
    13
    Trophy Points:
    31
    I use a 980Ti in my AGA on a GSync monitor. Stock clocks on GPU, I run ~13k in Firestrike at 2560x1440. Game performance is excellent in current gen games.
     
  35. Welshmousepk

    Welshmousepk Notebook Consultant

    Reputations:
    1
    Messages:
    182
    Likes Received:
    11
    Trophy Points:
    31
    Can you provide a couple of specific examples of average framerates in a few games?

    No need to go crazy with benchmarks. Just let me know your clocks (if OCd) and roughly how the frames are in a couple games I can compare to benchmarks.

    Also, what CPU are you running?

    EDIT: If the AGA really can push 144 Hz at 1440p, I'm leaning heavily toward a ROG Swift again. I owned one previously and loved it.
    The advantage there is I can put off getting an external GPU for a little while longer, because a 970M should be able to push 50+ FPS at 1440p on med-high settings, which will be perfectly playable.

    Whereas if I get a 4K monitor, I'll either be dealing with non-native res or horrible framerates.
     
    Last edited: Nov 16, 2015
  36. MatthewAMEL

    MatthewAMEL Notebook Consultant

    Reputations:
    80
    Messages:
    128
    Likes Received:
    13
    Trophy Points:
    31
    Here is the output from Unigine Heaven:

    Unigine Heaven Benchmark 4.0
    FPS: 107.5
    Score: 2708
    Min FPS: 26.6
    Max FPS: 217.5
    System
    Platform:Windows NT 6.2 (build 9200) 64bit
    CPU model:Intel(R) Core(TM) i7-4980HQ CPU @ 2.80GHz (2798MHz) x4
    GPU model:NVIDIA GeForce GTX 980 Ti 10.18.13.5891/Intel(R) Iris(TM) Pro Graphics 5200 10.18.15.4248 (4095MB) x1
    Settings
    Render :Direct3D11
    Mode:1600x900 8xAA windowed
    Preset: Extreme

    60 FPS at Ultra/Max (2560x1440) in GTA V, Project Cars, MGS V: The Phantom Pain, Fallout 4, Shadow of Mordor, World of Warships.

    I'd expect to need to drop a notch or two at 4K or run SLI to stay at Max. I'm interested to see how it handles Ashes of the Singularity. I have a relatively weak CPU.
     
  37. Game7a1

    Game7a1 ?

    Reputations:
    529
    Messages:
    3,159
    Likes Received:
    1,040
    Trophy Points:
    231
  38. nguyenquanavi

    nguyenquanavi Notebook Enthusiast

    Reputations:
    0
    Messages:
    17
    Likes Received:
    2
    Trophy Points:
    6
  39. Game7a1

    Game7a1 ?

    Reputations:
    529
    Messages:
    3,159
    Likes Received:
    1,040
    Trophy Points:
    231
    Uhh... the 13 R1 has 2.0. It operates at 1.1 when idle and 2.0 under load (in other words, don't freak out).
    This is a CPU limitation.
     
  40. Welshmousepk

    Welshmousepk Notebook Consultant

    Reputations:
    1
    Messages:
    182
    Likes Received:
    11
    Trophy Points:
    31
    Awesome. I've just ordered an AGA on eBay (but no actual GPU for it yet).
     
  41. raiden87

    raiden87 Notebook Evangelist

    Reputations:
    46
    Messages:
    341
    Likes Received:
    123
    Trophy Points:
    56
    I can confirm that on the AW17 R2 the GA now works at Gen3 with Nvidia cards.
     
  42. Punisher5.0

    Punisher5.0 Notebook Geek

    Reputations:
    0
    Messages:
    79
    Likes Received:
    16
    Trophy Points:
    16
    But does it take advantage of it?
     
  43. raiden87

    raiden87 Notebook Evangelist

    Reputations:
    46
    Messages:
    341
    Likes Received:
    123
    Trophy Points:
    56
    It should be around 7-10%, but I haven't tested it.
     
  44. Welshmousepk

    Welshmousepk Notebook Consultant

    Reputations:
    1
    Messages:
    182
    Likes Received:
    11
    Trophy Points:
    31
    So I've finally settled on a 390X for my AGA; both are on the way.

    I went with a Sapphire 390X Tri-X. Has anyone used one? I'm concerned it might be a little too long.
     
  45. ElCaptainX

    ElCaptainX Notebook Consultant

    Reputations:
    17
    Messages:
    282
    Likes Received:
    68
    Trophy Points:
    41
    Guys, will the 6820HK CPU in the AW 17 last 4 to 5 years with the amp and a high-end graphics card in the future, without bottlenecking?
     
  46. Welshmousepk

    Welshmousepk Notebook Consultant

    Reputations:
    1
    Messages:
    182
    Likes Received:
    11
    Trophy Points:
    31
    Can anyone actually measure the available space for the GPU inside the AGA?

    The quoted max length inside turns out to be an inch less than the new card I ordered. I know they are often conservative with these, so before I waste the time and money returning my card I'd like to know if it might fit. Once it's opened, I can't return it.
     
  47. DjId10t

    DjId10t Notebook Enthusiast

    Reputations:
    0
    Messages:
    48
    Likes Received:
    0
    Trophy Points:
    15
    I have the AGA with an R9 390, but it isn't recognized when I run a 3DMark test; it comes back as a generic VGA video card. A small thing, but I am curious to see whether this is normal or whether I have done something wrong. It is labeled correctly in Device Manager, and the 3DMark system scanner is up to date. Is this just a product of the AGA being used?

    Any insight will be helpful, thanks.
     
  48. Game7a1

    Game7a1 ?

    Reputations:
    529
    Messages:
    3,159
    Likes Received:
    1,040
    Trophy Points:
    231
    It could be a driver thing. Before, my R9 285 would register as an R9 285 or R9 380 in 3DMark. Now it's just Generic VGA.
    Generic VGA also isn't new (I got it on both Nvidia and AMD at one point), but it doesn't affect the actual performance of the GPU.
     
  49. Splintah

    Splintah Notebook Deity

    Reputations:
    278
    Messages:
    1,948
    Likes Received:
    595
    Trophy Points:
    131
    As time progresses I am thinking more and more that my 17r2 was a fantastic decision. I can upgrade it as long as new desktop cards are being produced.
     
  50. Lordofdeath

    Lordofdeath Newbie

    Reputations:
    0
    Messages:
    2
    Likes Received:
    0
    Trophy Points:
    5
    Is it possible for Alienware to release another edition of the amplifier that runs at x16 speed and works with the current proprietary socket they designed? Or does the proprietary socket limit it to x4? Or is a future Thunderbolt eGPU solution possible? I just purchased an Alienware 15 with a 980M 4 GB, an i7-6820HK Skylake processor, and 16 GB of RAM. Will I be able to game with this at 1080p at mid-high settings for at least 2-3 years?
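    To put the x4 vs x16 question in numbers: theoretical one-direction PCIe bandwidth follows directly from the per-generation transfer rate and line encoding (8b/10b for Gen2, 128b/130b for Gen3). A quick back-of-envelope:

    ```python
    # Theoretical one-direction PCIe bandwidth per link, to compare the
    # AGA's x4 link against a desktop's x16 slot. Figures are maxima;
    # real-world throughput is lower due to protocol overhead.
    def pcie_gbps(gen, lanes):
        """Approximate one-direction bandwidth in GB/s."""
        # (transfer rate in GT/s, encoding efficiency) per generation
        specs = {2: (5.0, 8 / 10), 3: (8.0, 128 / 130)}
        gt_s, eff = specs[gen]
        return gt_s * eff * lanes / 8  # bits -> bytes

    print(f"Gen2 x4 : {pcie_gbps(2, 4):.1f} GB/s")    # ~2.0 GB/s
    print(f"Gen3 x4 : {pcie_gbps(3, 4):.2f} GB/s")    # ~3.94 GB/s
    print(f"Gen3 x16: {pcie_gbps(3, 16):.1f} GB/s")   # ~15.8 GB/s
    ```

    So the Gen3 BIOS update discussed earlier in the thread roughly doubles the AGA link versus Gen2 x4, but it still sits at about a quarter of a desktop x16 slot; in practice games are far less sensitive to link width than the raw numbers suggest, since most data lives in VRAM after loading.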
     