The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information that had been posted on the forums is preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.

    *OFFICIAL* Alienware "Graphics Amplifier" Owner's Lounge and Benchmark Thread (All 13, 15 and 17)

    Discussion in '2015+ Alienware 13 / 15 / 17' started by Mr. Fox, Dec 10, 2014.

  1. grkstyla

    grkstyla Notebook Geek

    Reputations:
    0
    Messages:
    88
    Likes Received:
    2
    Trophy Points:
    16
    Although the case is already too narrow the way it's laid out, this sounds like a great mod.

    I remember reading somewhere that the actual wattage a 980 Ti needs is much lower than its TDP; I believe it was around a 175W draw from the power supply at full utilization, but I can't find it anywhere. What dimensions were you aiming for? You should keep us posted on progress.

    My starting point would be to put the case by itself on a power meter and check power consumption with the stock power supply, etc.; that would give you an idea of the current needed. The perfect final product would use an external power supply, like the 18's 330W for example, meaning the enclosure would be roughly double the size of a 3.5" HDD case.
     
  2. S.O.L.O.

    S.O.L.O. Notebook Enthusiast

    Reputations:
    5
    Messages:
    42
    Likes Received:
    2
    Trophy Points:
    16
    This review shows the TDP for the 980 Ti, though it mentions that during a gaming test it pulled a maximum of "428.38W", which is higher than the TDP.

    http://www.tomshardware.com/reviews/nvidia-geforce-gtx-980-ti,4164-7.html

    Another possible way to fit a power supply would be to use a 1U, SFX, TFX, etc. unit with the GPU stacked on top; that way a higher-wattage PSU can be used while keeping the width about the same as the card, making for better portability.

    Also, the length of the case could be reduced to 7", since new graphics cards using HBM are all likely to be half-length like the new Fury series.
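
    For rough sizing, a quick back-of-the-envelope calculation like the sketch below can help reconcile TDP-style numbers with the peak spikes mentioned above; all the wattage figures in it are illustrative assumptions (Nvidia's 250W board power for the 980 Ti, a guessed USB and fan load), not measurements from this thread.

```python
# Rough PSU-sizing sketch for an external GPU enclosure.
# All wattage figures below are illustrative assumptions, not measurements.

GPU_TYPICAL_W = 250      # 980 Ti rated board power (TDP)
GPU_PEAK_W = 430         # brief transient spikes reported in some reviews
USB_DEVICES_W = 20       # a few bus-powered USB devices on the enclosure
FAN_AND_LOGIC_W = 10     # enclosure fan, PCIe slot logic, LEDs
HEADROOM = 1.2           # 20% margin so the PSU isn't running at its limit

sustained = GPU_TYPICAL_W + USB_DEVICES_W + FAN_AND_LOGIC_W
recommended_psu = sustained * HEADROOM

print(f"Sustained draw estimate : {sustained} W")
print(f"Recommended PSU rating  : {recommended_psu:.0f} W")
print(f"Transient peak to absorb: {GPU_PEAK_W} W (brief spikes only)")
```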

     
    Last edited: Sep 26, 2015
  3. grkstyla

    grkstyla Notebook Geek

    Reputations:
    0
    Messages:
    88
    Likes Received:
    2
    Trophy Points:
    16
    Finally, someone that knows what they are talking about. Very cool idea; an external power supply, or two of them like those modded 17s with dual 220W, would be my preference. It would ensure the enclosure is very small, but I would leave enough length and width in the enclosure for future card swaps (unlike Alienware's design, which only allows reference coolers to fit).
     
  4. S.O.L.O.

    S.O.L.O. Notebook Enthusiast

    Reputations:
    5
    Messages:
    42
    Likes Received:
    2
    Trophy Points:
    16
    Thank You :)

    At this point, I think it might be worthwhile to just hold out for a few months, since there will likely be some Thunderbolt 3.0 eGPUs coming out which will possibly have thinner profiles. There is also a rumor of OCuLink making its debut in the fall of 2015, and Alienware has been rumored to be using it with the Graphics Amplifier, though they probably got access to an earlier spec of OCuLink which only supports x4.




    Oculink
    http://www.kitguru.net/components/a...alize-oculink-external-pci-express-this-fall/
     
  5. Welshmousepk

    Welshmousepk Notebook Consultant

    Reputations:
    1
    Messages:
    182
    Likes Received:
    11
    Trophy Points:
    31
    So I ended up spending a little extra and getting the 15 R2 with Skylake, as with an i7-6700 and a 970M (and DDR4 RAM, which I didn't realize it came with) it's actually only 200 bucks more than the similarly specced R1 I was intending to buy (despite that one being advertised as 500 dollars off).
    I sold my desktop to pay for this though, and it looks like it will take 2 weeks to arrive. Need to find something else to keep me busy for a while.

    Really curious to know if Skylake improves GA performance though, as it apparently turbos more consistently due to better thermals. Sounds as though a bit of an overclock might be enough to get very close to desktop-level performance.
    I won't be getting a GA until next pay month due to other commitments (tattoos are expensive...), so I hope the extra cost will have been worth it.
     
  6. grkstyla

    grkstyla Notebook Geek

    Reputations:
    0
    Messages:
    88
    Likes Received:
    2
    Trophy Points:
    16
    With my current 17 R2 failing even after a mobo replacement, etc., I am pushing to replace it with a newer model or get a refund. Since I do very light gaming when traveling (League of Legends), I'm now wondering whether I should just get the Alienware 13 (latest model) and use my current GA with the GTX 980.

    Do you guys think there will be a big drop in performance? Are the newer models better?

    I guess my question is, which option?

    1. Skylake 17 with GA 980
    2. Skylake 13 with GA 980 (and save the difference for an additional SSD or something)
    3. XPS 13 (will this run LoL?) and a mATX desktop to put my 980 in (around the same budget with a 6700K, etc.)

    Also, the 15 has an M395X graphics option which bumps the CPU to an i7-6820HK, so that might solve the CPU bottleneck and the GA driver issues I was having.
     
    Last edited: Sep 28, 2015
  7. batman900

    batman900 Notebook Enthusiast

    Reputations:
    0
    Messages:
    25
    Likes Received:
    0
    Trophy Points:
    5
    How did you solve this? TY!
     
  8. Welshmousepk

    Welshmousepk Notebook Consultant

    Reputations:
    1
    Messages:
    182
    Likes Received:
    11
    Trophy Points:
    31
    I ended up getting the 15 over the 13 (despite my preference for the smaller form factor) specifically because of the heavy CPU bottleneck. Games will still run with the GA 980, but you'll be getting much lower performance than you could. I also expect games coming out from now onwards will be much more CPU-intensive than in the past, now that we're into primary development for the next-gen consoles.
     
  9. grkstyla

    grkstyla Notebook Geek

    Reputations:
    0
    Messages:
    88
    Likes Received:
    2
    Trophy Points:
    16
    It seems the CPU overclock while the GA is plugged in was a big selling point; I wonder if anyone here has actually gotten that working. From my experience, if the GA is plugged in and overclocking is enabled, the 4XXX CPU doesn't go over 2.5GHz on overclock level 1.
     
  10. grkstyla

    grkstyla Notebook Geek

    Reputations:
    0
    Messages:
    88
    Likes Received:
    2
    Trophy Points:
    16
    Here you go, let me know if you have any questions

     
  11. Welshmousepk

    Welshmousepk Notebook Consultant

    Reputations:
    1
    Messages:
    182
    Likes Received:
    11
    Trophy Points:
    31
    Here's a question I don't see addressed anywhere:

    The GA seems to use the same PCIe connection for the GPU and USB devices. Given the GPU is likely saturating most of the bandwidth, does having USB devices hooked up to the GA impact performance?

    If you have a USB sound card, a USB 3 HDD, and a mouse and KB hooked up to it, I imagine that could be a lot of data. If this impacts GPU performance, then a separate USB hub may be the way to go.
     
  12. grkstyla

    grkstyla Notebook Geek

    Reputations:
    0
    Messages:
    88
    Likes Received:
    2
    Trophy Points:
    16
    I run a lot of USB 3 devices off the back of the GA.

    The GA cable feels like it has 2 separate cables in it when I squeeze it.

    Gaming doesn't affect the USB 3 hard drive rate, but the GA isn't necessarily fast for USB; I would see it as a glorified hub going to a single USB 3 port. I have seen over 250MB/sec going to USB 3 SSDs, etc., but going to multiple 2.5" spinners, it can't max them out.

    Also, the number of devices when the GA is plugged in is limited; I have found that 16 is the max. For example, if I have 16 devices plugged into the GA's 4 USB 3 ports using hubs, etc., the ports on the laptop will fail with anything I plug into them.

    Another interesting thing is that the Gigabit port on the laptop seems to register as 2 different devices with and without the GA, i.e. Killer #1 and Killer #2 when the GA is plugged in. It seems to have something to do with the rerouting as well, but I definitely haven't seen a bandwidth issue there, as I saturate it almost all the time. A couple of times a week it will lock up, freezing all network activity and requiring a reboot; I've tried different drivers, etc., and disabling Killer bandwidth control, so I just assume it's not made for my workload.
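
    As a rough sanity check on whether USB traffic could crowd the GPU off the GA's link, here's a minimal back-of-the-envelope sketch; the PCIe 3.0 x4 link figure and the per-device rates are illustrative assumptions (the SSD rate is borrowed from the ~250MB/sec observation above), not published specs for the amplifier.

```python
# Rough sketch: how much of an assumed PCIe 3.0 x4 link USB traffic could occupy.
# The link figure and per-device rates below are illustrative assumptions only.

LINK_MB_S = 3940                 # ~3.94 GB/s usable on PCIe 3.0 x4 (assumed)

usb_devices_mb_s = {
    "USB 3 SSD": 250,            # the ~250 MB/s figure mentioned above
    "USB sound card": 1,         # audio streams are tiny
    "mouse + keyboard": 0.1,
    "2.5in spinner #1": 100,
    "2.5in spinner #2": 100,
}

usb_total = sum(usb_devices_mb_s.values())
print(f"USB total     : {usb_total:.1f} MB/s")
print(f"Share of link : {usb_total / LINK_MB_S:.1%}")
# Even a busy USB load is a small slice of the link, so latency contention,
# not raw bandwidth, is the more likely cause of any hiccups.
```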
     
  13. Welshmousepk

    Welshmousepk Notebook Consultant

    Reputations:
    1
    Messages:
    182
    Likes Received:
    11
    Trophy Points:
    31
    Awesome, thanks for the info. I intend to use a USB soundcard, mouse and KB, and potentially a couple USB drives (for media only) and wireless dongles. Hopefully that should all work okay.
     
  14. grkstyla

    grkstyla Notebook Geek

    Reputations:
    0
    Messages:
    88
    Likes Received:
    2
    Trophy Points:
    16
    Yes that should work fine, far less demanding than my setup
     
  15. Dan.J

    Dan.J Notebook Geek

    Reputations:
    0
    Messages:
    88
    Likes Received:
    16
    Trophy Points:
    16
    So I have a new 17 R3 coming and have a GA that I plan to use. What seems to be the best card to use in these for things to work right without driver issues, etc.? Nvidia or AMD? I have a 980M coming in the 17 R3 and plan to run a GTX 970 to 980 Ti in the amp to drive my external display.
     
  16. grkstyla

    grkstyla Notebook Geek

    Reputations:
    0
    Messages:
    88
    Likes Received:
    2
    Trophy Points:
    16
    Anything will work as long as you force the Dell 353.54 driver and block Windows from updating it. If you want the latest Nvidia drivers, then the laptop you ordered needed to have the AMD R9 395M, so the drivers don't affect each other.
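
    If you go the "block Windows from updating" route on Windows 10, one way to do it is the driver-update policy value; the sketch below is a minimal illustration using Python's winreg, and it assumes a later Windows 10 build that honours this policy plus an elevated (administrator) prompt. Microsoft's "Show or hide updates" troubleshooter is the usual point-and-click alternative.

```python
# Minimal sketch: tell Windows Update not to pull in driver packages, so a
# forced Dell GPU driver (e.g. 353.54) doesn't get replaced automatically.
# Assumptions: a Windows 10 build that honours this policy value, and the
# script is run from an elevated (administrator) Python prompt.
import winreg

KEY_PATH = r"SOFTWARE\Policies\Microsoft\Windows\WindowsUpdate"

with winreg.CreateKeyEx(winreg.HKEY_LOCAL_MACHINE, KEY_PATH, 0,
                        winreg.KEY_SET_VALUE) as key:
    # 1 = exclude driver packages from Windows quality updates
    winreg.SetValueEx(key, "ExcludeWUDriversInQualityUpdate", 0,
                      winreg.REG_DWORD, 1)

print("Policy set: Windows Update will skip driver packages.")
```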
     
  17. Dan.J

    Dan.J Notebook Geek

    Reputations:
    0
    Messages:
    88
    Likes Received:
    16
    Trophy Points:
    16
    So I would be better off with an AMD card in the GA by that logic then, I suppose? Like a 390X or Fury, I guess; what is known to fit? Thanks.
     
  18. grkstyla

    grkstyla Notebook Geek

    Reputations:
    0
    Messages:
    88
    Likes Received:
    2
    Trophy Points:
    16
    The only AMD card I would pick for the GA is the Fury X, with its fan put at the front to replace the loud Alienware fan. I don't have one to check whether it fits, but I read somewhere that the 390Xs don't fit and that the Fury X and Nano are smaller and should. I don't know how the water block gets power; if it's powered off the card's own power, there shouldn't be an issue.
     
    Last edited: Sep 29, 2015
  19. Dan.J

    Dan.J Notebook Geek

    Reputations:
    0
    Messages:
    88
    Likes Received:
    16
    Trophy Points:
    16
    Thanks, maybe the new Nano would make the most sense. I'm getting older and just like things to work for the most part; I don't want a bunch of driver conflicts.
     
  20. grkstyla

    grkstyla Notebook Geek

    Reputations:
    0
    Messages:
    88
    Likes Received:
    2
    Trophy Points:
    16
    Yeah, it would be reliable; do your homework on Nano vs. Fury X performance though.
     
  21. Welshmousepk

    Welshmousepk Notebook Consultant

    Reputations:
    1
    Messages:
    182
    Likes Received:
    11
    Trophy Points:
    31
    The Nano costs more for less performance; I wouldn't bother. If an R9 Fury will fit, it's a good choice.

    I'm planning to get a 970 and later on a 980 Ti, but I'm not sure if I'll regret that choice with the driver issues.
     
  22. bumbo2

    bumbo2 Notebook Deity

    Reputations:
    324
    Messages:
    1,612
    Likes Received:
    104
    Trophy Points:
    81
    Don't buy an Nvidia card for the GAP if you don't want problems!
     
  23. Dan.J

    Dan.J Notebook Geek

    Reputations:
    0
    Messages:
    88
    Likes Received:
    16
    Trophy Points:
    16
    What is it exactly that causes the driver issues? I have wondered how it all works with the GA. So in my case I will have the Intel on-die graphics, a mobile driver from Nvidia for the 980M, and then also a regular desktop driver package for whatever card, be it AMD or Nvidia, that's in the GA? I could see how this setup could bring issues for sure, if this is how things work. I guess I will figure it out once my stuff is here. I would prefer to go Nvidia in the GA, but not if it's a pain to get it to work right with the rest of my gear.
     
  24. Welshmousepk

    Welshmousepk Notebook Consultant

    Reputations:
    1
    Messages:
    182
    Likes Received:
    11
    Trophy Points:
    31
    If AMD had a card that wasn't awful, I'd gladly buy one.
    Here in NZ, Nvidia cards are cheaper than AMD at almost every price point. At the high end, AMD can't even touch them: the 980 Ti is vastly superior to a Fury X and they cost the same.

    Once I get a card and start looking into it, I may be able to do something about the driver issues. I can't imagine it would require much more than some INF trickery to get them working. Not sure if anyone has really looked into it yet?
     
  25. Dan.J

    Dan.J Notebook Geek

    Reputations:
    0
    Messages:
    88
    Likes Received:
    16
    Trophy Points:
    16
    You have an MSI 390X in your GA? Does it fit OK? What resolution do you play at, and are you happy with the performance? Thanks.
     
  26. bumbo2

    bumbo2 Notebook Deity

    Reputations:
    324
    Messages:
    1,612
    Likes Received:
    104
    Trophy Points:
    81
    The MSI 390X works fine on the GAP, but I have to leave it open because I can't close it. I play The Witcher 3 maxed out, and when I say maxed out I mean nothing turned off.
     
  27. grkstyla

    grkstyla Notebook Geek

    Reputations:
    0
    Messages:
    88
    Likes Received:
    2
    Trophy Points:
    16
    Driver problems won't be easy to fix.

    It's a card detection issue, only fixed by using the old Dell driver and forcing Windows Update to ignore new drivers.

    If you don't mind the old driver, and don't mind missing out on the latest game optimisations, then Nvidia is fine and overclocking will outperform everything.

    If you want the latest drivers with no issues and aren't going to overclock, the Fury X is the way to go, as its performance sits between the 980 and 980 Ti and is only a couple of fps behind the latter when both are at stock clocks.

    Another solution is to get the AMD 395M card in the laptop; that way you get a much better CPU and can run anything you want in the GA with any driver.
     
  28. grkstyla

    grkstyla Notebook Geek

    Reputations:
    0
    Messages:
    88
    Likes Received:
    2
    Trophy Points:
    16
    It's not a separate driver: if both the internal and external cards are Nvidia, you have one mobile driver handling both, and it fails to detect the internal card unless you use the old Dell driver.

    The ATI driver doesn't have this issue.
     
  29. Game7a1

    Game7a1 ?

    Reputations:
    529
    Messages:
    3,159
    Likes Received:
    1,040
    Trophy Points:
    231
    Sadly, modding drivers to add IDs won't work, as it'll be like the old times: the internal GPU doesn't work after the GA is unplugged. Many have tried, and none have succeeded. Of course, that may be because we haven't tried different ways of making it work; I assume all of us just modded the driver and booted normally.
    If the desktop GPU ID isn't in the Dell .inf file, then problems happen with detecting the onboard Nvidia GPU (and the desktop Nvidia GPU too). The graphics driver does not have to be from Dell (as Dell's are sometimes outdated).
    The problem discussed is more relevant on Win10 because of its forced updates. Win7 and Win8.1 don't really have forced updates (although some optional Nvidia drivers from Windows Update can sneak by).
     
    Last edited: Sep 29, 2015
  30. Dan.J

    Dan.J Notebook Geek

    Reputations:
    0
    Messages:
    88
    Likes Received:
    16
    Trophy Points:
    16
    Thanks guys, makes more sense to me now.
     
  31. Welshmousepk

    Welshmousepk Notebook Consultant

    Reputations:
    1
    Messages:
    182
    Likes Received:
    11
    Trophy Points:
    31
    The 980 Ti absolutely demolishes the Fury when both are overclocked though. My previous 980 Ti could get almost 30 percent more performance than stock once OC'd. Here in NZ, the Fury X is actually slightly more expensive than a stock 980 Ti. Since I am intending to play at 4K, I need all the performance I can get.

    What's the actual workaround for using new drivers for the desktop card? Is it to reinstall drivers any time you disconnect the GA? Not ideal, but doable if the drivers are kept somewhere easy to access.

    Starting to question whether this was actually a good purchase now. I had gotten rid of my desktop and laptop thinking this would be able to serve both purposes.
     
  32. grkstyla

    grkstyla Notebook Geek

    Reputations:
    0
    Messages:
    88
    Likes Received:
    2
    Trophy Points:
    16
    You have 2 options: either reinstall the mobile driver after disconnecting from the GA, or disable the internal mobile card within 10 seconds of logging in. Both will stop the BSOD.
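
    If you'd rather script the second option than race the 10-second window by hand, something along these lines could be wired into a logon scheduled task; it's only a sketch, assuming Windows 8.1/10 with the PnpDevice PowerShell cmdlets available, an elevated session, and a hypothetical "GTX 980M" name match for the internal card.

```python
# Sketch: disable the internal (mobile) Nvidia GPU shortly after login so the
# desktop card in the GA can keep its newer driver without the BSOD on unplug.
# Assumptions: Windows 8.1/10 with the PnpDevice PowerShell module, run
# elevated, and the mobile GPU's friendly name contains "GTX 980M".
import subprocess

MOBILE_GPU_NAME = "GTX 980M"   # hypothetical match string; adjust for your card

ps_script = (
    f"Get-PnpDevice -Class Display | "
    f"Where-Object {{ $_.FriendlyName -like '*{MOBILE_GPU_NAME}*' }} | "
    f"Disable-PnpDevice -Confirm:$false"
)

subprocess.run(["powershell", "-NoProfile", "-Command", ps_script], check=True)
print(f"Disabled display devices matching '{MOBILE_GPU_NAME}'.")
```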
     
  33. Welshmousepk

    Welshmousepk Notebook Consultant

    Reputations:
    1
    Messages:
    182
    Likes Received:
    11
    Trophy Points:
    31
    I think I might go with a 390X to avoid the ********. Much better value proposition here in NZ; it costs only a little more than a 970 but a fair chunk less than a 980 Ti.
    Will probably not be able to do 4K on it though.

    Does anyone have experience pushing high refresh rates on a laptop CPU? I would expect the CPU limitation to be more apparent when pushing 144Hz?
    Worth bothering?
     
  34. MatthewAMEL

    MatthewAMEL Notebook Consultant

    Reputations:
    80
    Messages:
    128
    Likes Received:
    13
    Trophy Points:
    31
    Refresh rates are not a concern if you go with a G-Sync or FreeSync monitor.
     
  35. Welshmousepk

    Welshmousepk Notebook Consultant

    Reputations:
    1
    Messages:
    182
    Likes Received:
    11
    Trophy Points:
    31
    They are if you want to push HIGH refresh rates though. The CPU is much more important for that, so if the CPU will barely be able to break 60fps regardless of the GPU, there'd be no point getting an HFR monitor.
     
  36. Dan.J

    Dan.J Notebook Geek

    Reputations:
    0
    Messages:
    88
    Likes Received:
    16
    Trophy Points:
    16
    Which model 390X is known to fit the GA properly? So to clarify: if running an Nvidia GPU in the laptop and an Nvidia GPU in the GA, you run only the Nvidia mobile driver and that's it. If running an Nvidia GPU in the laptop and an AMD GPU in the GA, then we run the Nvidia mobile driver for the laptop and the AMD desktop drivers separately for the GA, and this setup seems to be less prone to issues?
     
  37. MatthewAMEL

    MatthewAMEL Notebook Consultant

    Reputations:
    80
    Messages:
    128
    Likes Received:
    13
    Trophy Points:
    31
    Perhaps you aren't familiar with G-Sync/FreeSync. They are designed to be variable-refresh-rate monitors and aren't constrained by a fixed rate.

    It's better in every circumstance.

    http://www.howtogeek.com/228735/g-sync-and-freesync-explained-variable-refresh-rates-for-gaming/
     
  38. Welshmousepk

    Welshmousepk Notebook Consultant

    Reputations:
    1
    Messages:
    182
    Likes Received:
    11
    Trophy Points:
    31
    I am very familiar, and have owned several G-Sync monitors.

    My point is that getting a 144Hz monitor is pointless if the laptop can only push 60fps, regardless of whether you have G-Sync or not.

    I would rather have 144Hz than 60Hz, but if the CPU limits the maximum framerate, I'd be better off sticking to 60Hz and a higher res (4K), as the GPU is the more important factor for that.
     
  39. Game7a1

    Game7a1 ?

    Reputations:
    529
    Messages:
    3,159
    Likes Received:
    1,040
    Trophy Points:
    231
    Pretty much. Only a couple of old AMD GPUs have problems. However, AMD + AMD hasn't been tested in public (I've only heard from one person, and it wasn't good).
    I would get either an R9 290X or an R9 390 over an R9 390X: better price and close enough performance. But if you want an R9 390X, then VisionTek's (usually a safe choice, like PNY for Nvidia), Gigabyte's (maybe this one too), and this PowerColor one should fit. There are more options for the R9 390 and R9 290X (and if you want to go cheaper, the R9 290 is also an option).
     
  40. Welshmousepk

    Welshmousepk Notebook Consultant

    Reputations:
    1
    Messages:
    182
    Likes Received:
    11
    Trophy Points:
    31
    Pricing structures in NZ are messed up, which always makes these decisions weird. The 390X is cheaper than the 290X.
    The 390X is only 100 NZD more than the 390. Performance-wise, I think it's the right call. The 390X is already much less than I'd want (I would get a 980 Ti), but with the driver issues I'm forced to AMD, and the 390X seems to be by far the best price/performance card at that level.
     
  41. Welshmousepk

    Welshmousepk Notebook Consultant

    Reputations:
    1
    Messages:
    182
    Likes Received:
    11
    Trophy Points:
    31
    To put it into perspective, prices in NZ (NZD):

    980 Ti: 1200
    Fury X: 1270
    Fury: 1035
    390X: 700
    390: 600
    970: 600

    So the 390X is waaay cheaper than a Fury for the performance it offers.
     
  42. Game7a1

    Game7a1 ?

    Reputations:
    529
    Messages:
    3,159
    Likes Received:
    1,040
    Trophy Points:
    231
    I'm never going to understand pricing over at AUS/NZ.
     
  43. Welshmousepk

    Welshmousepk Notebook Consultant

    Reputations:
    1
    Messages:
    182
    Likes Received:
    11
    Trophy Points:
    31
    Tight border controls, and massive import levies and tax. Low population in NZ means no economies of scale.
    It always gets messed up.
     
  44. Game7a1

    Game7a1 ?

    Reputations:
    529
    Messages:
    3,159
    Likes Received:
    1,040
    Trophy Points:
    231
    No, I mean how it gets so expensive. I know why, I just don't get how (I need the numbers to understand how).
     
  45. MatthewAMEL

    MatthewAMEL Notebook Consultant

    Reputations:
    80
    Messages:
    128
    Likes Received:
    13
    Trophy Points:
    31
    Nope. You still aren't getting it. Owning a technology is not the same as understanding how it works.

    Your monitor's refresh rate and your GPU's (not CPU's) frame rate are not the same thing. If the GPU can't provide a constant, fixed 30 or 60 or 144 frames per second, then you get tearing (no V-sync) or stuttering (with V-sync).

    Both G-Sync and FreeSync eliminate this problem by varying the refresh rate.

    http://www.geforce.com/hardware/technology/g-sync/technology
     
  46. RatioKiller

    RatioKiller Notebook Enthusiast

    Reputations:
    0
    Messages:
    48
    Likes Received:
    10
    Trophy Points:
    16
    Hey guys, got a question for you. So I am looking into buying a GA (I have the A15), but I am trying to lock down a good card prior to buying it. I know Nvidia is having some driver issues, so I want to stick with AMD. I am looking at the new R9 Nano, but I haven't heard much about it, specifically in the GA. Anyone have one in their GA? Or does anyone have a recommendation?

    I have heard things about the Hyper X, but I am worried about it fitting in the case; does it? I don't want anything to be sticking out, etc.

    Budget is anything goes, as long as it's good.

    Thanks.
     
  47. Dan.J

    Dan.J Notebook Geek

    Reputations:
    0
    Messages:
    88
    Likes Received:
    16
    Trophy Points:
    16
    Both the Nano and the Fury X should fit the GA as they are pretty small cards; you would have to mount the cooler/fan setup for the Fury X somewhere, obviously. I have been looking at them too, but the only way I would go with a Fury/Nano is if I were going to be using a 4K display, as at lower resolutions it's not that great for the money. I'm looking at a 290X/390X for myself; I only play at 1080p but need a 120/144Hz panel. Honestly I would prefer Nvidia, but I have no desire to deal with a bunch of driver issues, so AMD it is.
     
    RatioKiller likes this.
  48. Welshmousepk

    Welshmousepk Notebook Consultant

    Reputations:
    1
    Messages:
    182
    Likes Received:
    11
    Trophy Points:
    31
    I'm not sure if you're completely misunderstanding me or what. Forget G-Sync for a second. I'm talking about the CPU bottleneck potentially limiting the maximum framerate. G-Sync does not allow the machine to render more frames. If the CPU bottleneck limits the maximum framerate, then getting a G-Sync monitor isn't going to allow me to play games at 144Hz.

    I'm not sure how you think G-Sync fits into this. I'm just talking about the fact that the CPU matters more the more frames you render, so if you have a weak CPU that can't feed the GPU fast enough to get 144Hz at decent settings, there would be no point getting a 144Hz monitor.
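
    To put that in concrete numbers, here's a tiny worked example; the 12 ms per-frame CPU cost is an illustrative assumption, not a benchmark of any particular laptop chip.

```python
# Tiny worked example of why the CPU cap matters more at high refresh rates.
# The per-frame CPU cost below is an illustrative assumption, not a benchmark.

cpu_ms_per_frame = 12.0                  # assumed CPU cost to prepare one frame
cpu_fps_cap = 1000.0 / cpu_ms_per_frame  # ~83 fps no matter how fast the GPU is

for hz in (60, 144):
    budget_ms = 1000.0 / hz              # time available per frame at this refresh
    ok = "achievable" if cpu_ms_per_frame <= budget_ms else "CPU-limited"
    print(f"{hz:>3} Hz: {budget_ms:.2f} ms budget, CPU needs "
          f"{cpu_ms_per_frame:.1f} ms -> {ok} (CPU cap ~{cpu_fps_cap:.0f} fps)")
```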
     
  49. RatioKiller

    RatioKiller Notebook Enthusiast

    Reputations:
    0
    Messages:
    48
    Likes Received:
    10
    Trophy Points:
    16
    I wasn't actually planning on running a 4K setup, so I might just have to look into the 290X/390X you mentioned. Thanks for the suggestions.

    Does anyone happen to know how big the CPU bottleneck would be? Also, I have the "old" Alienware 15, so I got the i7-4710HQ chip. I ask because I am all for spending the money on a nice graphics card, but if my CPU isn't going to max it out, or nearly max it out, there is no point.

    Side note: in the future I plan to buy either SteamVR or an Oculus Rift, and I would like a setup that could handle it well.

    Thanks
     
  50. Dan.J

    Dan.J Notebook Geek

    Reputations:
    0
    Messages:
    88
    Likes Received:
    16
    Trophy Points:
    16
    I got a really good deal on an EVGA GTX 980 FTW, so I guess I am going to cross my fingers and hope for the best with the driver issues. I am getting the i7-6820HK and am not worried about any CPU bottleneck with a single card whatsoever. I would think your 4710 would be fine as well; I can't imagine it would be much of an issue, if any, really, but I'm sure someone will chime in who knows for sure.
     