The Notebook Review forums were hosted by TechTarget, who shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information that had been posted on the forums is preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.

    *OFFICIAL* Alienware "Graphics Amplifier" Owner's Lounge and Benchmark Thread (All 13, 15 and 17)

    Discussion in '2015+ Alienware 13 / 15 / 17' started by Mr. Fox, Dec 10, 2014.

  1. MatthewAMEL

    MatthewAMEL Notebook Consultant

    Reputations:
    80
    Messages:
    128
    Likes Received:
    13
    Trophy Points:
    31
    I have not tried that, but the latest version of DDU has a 'disable Windows automatic driver update' setting and I have used that since July.
     
  2. Dan.J

    Dan.J Notebook Geek

    Reputations:
    0
    Messages:
    88
    Likes Received:
    16
    Trophy Points:
    16
    I don't think so; doesn't SLI still require a bridge between cards? I haven't messed with SLI in ages as I simply don't like how it works, and even though the frame rate is higher it just feels like rubbish to me. Maybe things have improved since I last messed with it, though.
     
  3. Dan.J

    Dan.J Notebook Geek

    Reputations:
    0
    Messages:
    88
    Likes Received:
    16
    Trophy Points:
    16
    With the new DDU, does that setting only affect updates for the graphics card, or does it affect all updates?
     
  4. Welshmousepk

    Welshmousepk Notebook Consultant

    Reputations:
    1
    Messages:
    182
    Likes Received:
    11
    Trophy Points:
    31
    AMD has removed the need for a bridge, so it's certainly possible without one (the GA would need to integrate the communication somehow).
    The bigger problem would be with bandwidth, as I think the PCIe lanes used are the same ones the internal GPU uses?

    The SLI problem you refer to is "microstutter" or frame latency, which is something both AMD and Nvidia have worked hard to fix in the last year or so. It's now virtually gone, and assuming a good SLI profile and GPU utilization, dual GPUs are pretty much as good as a single one in that regard.
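    For anyone wondering what that frame latency looks like in numbers, here is a minimal sketch of how you could quantify it from per-frame render times (the values below are hypothetical, just to illustrate the idea):

    Code (Python):
    # Microstutter shows up in the spread of individual frame times,
    # not in the average fps. Hypothetical frame times in milliseconds:
    frame_times_ms = [16.2, 16.5, 16.1, 24.9, 16.3, 16.4, 25.1, 16.2]

    avg = sum(frame_times_ms) / len(frame_times_ms)
    worst = max(frame_times_ms)

    print(f"average frame time: {avg:.1f} ms (~{1000 / avg:.0f} fps)")
    print(f"worst frame time:   {worst:.1f} ms")
    # A decent average fps with regular 20-30 ms spikes is exactly the
    # "feels like rubbish" effect: the counter looks fine, pacing isn't.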
     
  5. kakashisensei

    kakashisensei Notebook Consultant

    Reputations:
    41
    Messages:
    217
    Likes Received:
    27
    Trophy Points:
    41
    I'm using a 980 Ti overclocked to ~1450 core (actual boost) / 8000 mem. If I recall, I was getting a ~18,500 3DMark (2013) GPU score using an external monitor, and ~16,500 GPU score using the internal screen. This is with an i7-4710HQ.
     
  6. xbouncingsoulx

    xbouncingsoulx Notebook Enthusiast

    Reputations:
    7
    Messages:
    46
    Likes Received:
    14
    Trophy Points:
    16
    Do you mean GPU score or overall score? Maybe you are confusing it with your Fire Strike scores? But 18,500 would be too low even for the GPU Fire Strike score...
    If you are comparing your overall scores, I'd like to know your XTU settings and your Physics and Combined scores :p
     
  7. raiden87

    raiden87 Notebook Evangelist

    Reputations:
    46
    Messages:
    341
    Likes Received:
    123
    Trophy Points:
    56
    You guys with an Amplifier:

    I got my AW17 R3 and I observed something strange... my PCI-E bandwidth is only PCI-E 2.0 x4, not PCI-E 3.0!
    Does anyone else see the same? You can check it in GPU-Z.
    Is this because the AW17 R3 shares its PCI-E port between the GA and the PCI-E M.2 SSDs?
    If this is true, it's just ********... halving the bandwidth again...
     
  8. Game7a1

    Game7a1 ?

    Reputations:
    529
    Messages:
    3,159
    Likes Received:
    1,040
    Trophy Points:
    231
    Please upload a GPU-Z screenshot, just to keep track of which laptops are affected.
    If anyone has a 13 R2 or 15 R2 and a Graphics Amplifier with a desktop GPU, upload a GPU-Z screenshot as well. I want to see if the PCI-E 2.0 x4 bandwidth is lineup-wide.
     
  9. raiden87

    raiden87 Notebook Evangelist

    Reputations:
    46
    Messages:
    341
    Likes Received:
    123
    Trophy Points:
    56
    Here is mine
    ga2.0pcie.JPG

    AW17 R3
    6820HK
    PCI-E SSD

    I wonder if that bandwidth would go up if we just use the SATA port, or if we put a SATA M.2 SSD in it?!

    Edit:
    At least it doesn't affect gaming much. My 3DMark score stayed the same (a bit higher due to the better CPU).
    http://www.3dmark.com/3dm/8904699?

    Edit2:
    So I just tried running the laptop without any SSD in the M.2 slot... still the same bandwidth. So I think it has to do with the design of the mainboard. Nothing we could change in any way.
    @Alienware-L_Porras could you let the techs know? Maybe there will be another revision of the mainboard... This is definitely a step backwards :(

    Edit3:
    So maybe I was a bit quick. I just asked Frank Azor via Twitter about that problem. He said last gen was PCI-E 2.0 too. I can't exactly recall if that's true for the quad-core AW15 or AW17.
     
    Last edited: Oct 14, 2015
  10. Welshmousepk

    Welshmousepk Notebook Consultant

    Reputations:
    1
    Messages:
    182
    Likes Received:
    11
    Trophy Points:
    31
    I could have sworn the 15 was supposed to be x8?

    x4 @2.0 doesn't seem like it could possibly be enough bandwidth for a high end card?
     
  11. Game7a1

    Game7a1 ?

    Reputations:
    529
    Messages:
    3,159
    Likes Received:
    1,040
    Trophy Points:
    231
    When using the GA, only four lanes are used. The internal Nvidia/AMD GPU uses 8 lanes (for the 15 and 17; 13 still uses 4 lanes due to CPU limitation).
    This is a physical restriction on the PCI-E port and cable.
    (Oh, and remove the @ in front of 2.0 because there's a member called 2.0, and he'll be receiving an alert that you tagged him.)
    @raiden87 No, the 15 R1 and 17 R2 were at PCI-E 3.0 x4 when using the GA.
    [IMG]
     
    Last edited: Oct 14, 2015
  12. Welshmousepk

    Welshmousepk Notebook Consultant

    Reputations:
    1
    Messages:
    182
    Likes Received:
    11
    Trophy Points:
    31
    x4 3.0 is more reasonable, as it would equate to x8 2.0.

    But I feel the need to check my 15 now, because if it's 2.0 x4 it's definitely going back. No way that wouldn't bottleneck a high-end card.
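    For reference, the raw link bandwidth works out like this (theoretical per-direction figures; real throughput is a bit lower):

    Code (Python):
    # PCI-E 2.0 runs at 5 GT/s with 8b/10b encoding, PCI-E 3.0 at 8 GT/s
    # with 128b/130b encoding, so per-lane throughput is roughly:
    per_lane_gbit = {"2.0": 5 * 8 / 10,       # 4.0 Gbit/s per lane
                     "3.0": 8 * 128 / 130}    # ~7.88 Gbit/s per lane

    for gen, lanes in [("2.0", 4), ("2.0", 8), ("3.0", 4), ("3.0", 8)]:
        print(f"PCI-E {gen} x{lanes}: ~{per_lane_gbit[gen] * lanes / 8:.1f} GB/s")
    # 2.0 x4 ~2.0 GB/s, 2.0 x8 ~4.0 GB/s, 3.0 x4 ~3.9 GB/s, 3.0 x8 ~7.9 GB/s,
    # which is why x4 3.0 is roughly equivalent to x8 2.0.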
     
  13. raiden87

    raiden87 Notebook Evangelist

    Reputations:
    46
    Messages:
    341
    Likes Received:
    123
    Trophy Points:
    56
    I have no problem with PCI-E 3.0 x4 since there isn't any real-world difference... but at 2.0 I am pretty sure we will see some games which run slower with the GA (GTX 980) than without (if you've got a GTX 980m).
    @Game7a1 did you take that screenshot yourself? I am a bit disappointed now :( since the Skylake CPUs run so much cooler than Haswell... it would have fixed the main issue of the previous gen. Now there are new ones... can't decide which is worse.
     
  14. Game7a1

    Game7a1 ?

    Reputations:
    529
    Messages:
    3,159
    Likes Received:
    1,040
    Trophy Points:
    231
    The average drop from PCI-E 3.0 x8 to PCI-E 2.0 x4 with the same CPU is 13% at 1080p, and the difference shrinks at higher resolutions. So no, the GTX 980m won't pull ahead of the GTX 980 in the GA in some games (especially since the GTX 980m is closer to the GTX 970 than to the GTX 980). However, it does mean that the GPU performance drop at PCI-E 3.0 x4 is between 8 and 9%, though if the CPU is strong enough, the difference may not be noticeable or present at all. You can read more about PCI-E scaling on the GTX 980 here, if you haven't already.
    It also means that getting a GTX 970 if you have a GTX 980m is ill-advised with the new laptops (to be honest, I always thought it wasn't the best idea with the older laptops either. It's almost like getting a GTX 960 for the GA when you have a GTX 970m in the laptop, or an R9 285 when you have the R9 M295X).
    The image came from the review I linked. I don't own a 15 R1 or 17 R2.
    I'm more inclined to think that the limitation is partially due to the Thunderbolt 3 port with its PCI-E 2.0 x4 bandwidth (the PCI-E SSDs can also be a factor). And because of the lower bandwidth, the new Alienware laptops have two ways of getting the same eGPU performance, through the GA port or through the TB3 port, if the support is there.
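    To put the percentages above into frame rates, a quick hypothetical (the 100 fps baseline is made up; the penalties are the averages quoted above):

    Code (Python):
    # Hypothetical card averaging 100 fps at PCI-E 3.0 x8 (1080p), with the
    # approximate average penalties from the linked scaling review.
    baseline_fps = 100.0
    penalty = {"3.0 x8": 0.00, "3.0 x4": 0.085, "2.0 x4": 0.13}

    for link, p in penalty.items():
        print(f"PCI-E {link}: ~{baseline_fps * (1 - p):.0f} fps")
    # If the mobile GPU starts well behind the desktop card to begin with
    # (980m vs 980), it generally stays behind even after the 13% hit.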
     
    Last edited: Oct 15, 2015
    judal57 likes this.
  15. raiden87

    raiden87 Notebook Evangelist

    Reputations:
    46
    Messages:
    341
    Likes Received:
    123
    Trophy Points:
    56
    Yeah, of course! I don't think there are many games where it will happen, but if I look at the link you posted, WoW could be a good candidate for it. The scaling hit is pretty hard there (x8 3.0 vs x4 2.0). I could imagine the GTX 980m performing better in this particular case.

    If I read the article and your comments right, the difference will be bigger if the game is CPU-demanding?
     
  16. Game7a1

    Game7a1 ?

    Reputations:
    529
    Messages:
    3,159
    Likes Received:
    1,040
    Trophy Points:
    231
    If all else fails, get an AMD GPU. I hear they have better PCI-E scaling than Nvidia GPUs (if you have a good CPU, of course. If you don't, like in my case, then Nvidia's the only option).
    I do hope Alienware finds a way to reverse this downgrade for the new laptops as they all support PCI-E 3.0. Possibly through disabling the TB3 bandwidth or something similar.
     
  17. raiden87

    raiden87 Notebook Evangelist

    Reputations:
    46
    Messages:
    341
    Likes Received:
    123
    Trophy Points:
    56
    Got news on our problem: there was a last-minute issue with the GA port, I guess, so they were forced to go with Gen 2.0! In a couple of weeks there will be a BIOS fix for this, but at first only for Nvidia cards and on the AW15/17. AMD and the AW13 will stay at Gen 2.0.
    twitter.JPG
     
  18. Game7a1

    Game7a1 ?

    Reputations:
    529
    Messages:
    3,159
    Likes Received:
    1,040
    Trophy Points:
    231
    I would really like an explanation from Alienware for AMD GPUs not getting PCI-E 3.0. Something about that doesn't make sense. If the port becomes 3.0, how can it be GPU-selective? I know that the HDMI port is 1.4 because AMD GPUs don't support HDMI 2.0, but that's about it.
    As for the 13 R2 not getting PCI-E 3.0, it seems as if it's getting BIOS-shafted again. Sucks, but that's what happened with the 13 R1.
     
  19. raiden87

    raiden87 Notebook Evangelist

    Reputations:
    46
    Messages:
    341
    Likes Received:
    123
    Trophy Points:
    56
    I thought the same thing. But maybe it has something to do with Optimus, which was always a plus for eGPUs. I am not that deep into this tech stuff :/ For the AW13 I don't have an answer... I can only imagine it's because of the Thunderbolt port and/or because of the PCI-E M.2 slots.
     
  20. kakashisensei

    kakashisensei Notebook Consultant

    Reputations:
    41
    Messages:
    217
    Likes Received:
    27
    Trophy Points:
    41
    Yeah that is low. Maybe I was mistaken. Gonna recheck after work.
     
  21. ttfid

    ttfid Notebook Geek

    Reputations:
    0
    Messages:
    97
    Likes Received:
    7
    Trophy Points:
    16
    I'm so glad I waited for the early adopters to test out the AW13 for me and report the problems and whether they were fixed or not.
    This happens with every company and every product, due to greed.
    I guess 'test engineering' is a lost field of work.
     
  22. Welshmousepk

    Welshmousepk Notebook Consultant

    Reputations:
    1
    Messages:
    182
    Likes Received:
    11
    Trophy Points:
    31
    Glad to hear there will be a BIOS fix for the PCIe issue. Hope this comes very soon.

    Frank talking about a negligible difference is very disingenuous. It's important to remember that many people expect the AGA to last several generations; that's sort of the whole point.
    If it's bottlenecking a high-end card now, it will be bottlenecking a mid-range card next gen. x4 3.0 is already pushing it; 2.0 would be completely unacceptable.
     
  23. Dan.J

    Dan.J Notebook Geek

    Reputations:
    0
    Messages:
    88
    Likes Received:
    16
    Trophy Points:
    16
    Is Alienware usually good about doing what they say with this kind of stuff, like fixing this issue with a BIOS update? I'm considering sending mine back before my 30 days are up if it isn't addressed before then, as this is a deal-breaker for me.
     
  24. raiden87

    raiden87 Notebook Evangelist

    Reputations:
    46
    Messages:
    341
    Likes Received:
    123
    Trophy Points:
    56
    Mh, can't really say much about that... in the M17x R days I think they were good at fixing stuff. But if we look at the latest problems with the fan tables... I wouldn't say so :/ I hope they do it fast :D Frank promised it to me :D
     
  25. S.O.L.O.

    S.O.L.O. Notebook Enthusiast

    Reputations:
    5
    Messages:
    42
    Likes Received:
    2
    Trophy Points:
    16
    Can someone confirm whether the PCI-E SSD affects the performance of the Graphics Amplifier?
     
  26. raiden87

    raiden87 Notebook Evangelist

    Reputations:
    46
    Messages:
    341
    Likes Received:
    123
    Trophy Points:
    56
    It doesn't affect it...
     
  27. Zorrobyte

    Zorrobyte Notebook Enthusiast

    Reputations:
    2
    Messages:
    17
    Likes Received:
    5
    Trophy Points:
    6
    I have a Sapphire 390 that is 12" long. I know the official spec is 10.5"; however, would it fit anyway? My Corsair 380T case says up to 10.5", yet the 12" 390 just fits. Pics: https://goo.gl/photos/ApKDND1CREzX15qU7

    Does the 390 work fine in general? Anyone else taken a Dremel to their graphics amp yet?
     
  28. judal57

    judal57 Notebook Deity

    Reputations:
    274
    Messages:
    1,164
    Likes Received:
    650
    Trophy Points:
    131
    The answer is the processor: fewer PCI Express lane configurations.

    i7-6700HQ = up to 1x16, 2x8, 1x8+2x4

    i7-6500U = 1x4, 2x2, 1x2+2x1 and 4x1
     
  29. Dark_

    Dark_ Notebook Consultant

    Reputations:
    154
    Messages:
    124
    Likes Received:
    35
    Trophy Points:
    41
    Has anyone swapped the PSU out with another? I'm asking for the sake of 'will it work', not because I think a GPU requires more power.

    My thoughts are to yank out the 'brain' board and fit it to a much smaller chassis and use an SFX PSU to help keep the size down.
     
  30. Zorrobyte

    Zorrobyte Notebook Enthusiast

    Reputations:
    2
    Messages:
    17
    Likes Received:
    5
    Trophy Points:
    6
    Logic says you should be able to. Just keep in mind how, in the mid-2000s, Dell set up PSUs with common connectors but pinouts that fried equipment if you weren't using their PSU. I seriously doubt they do this anymore; I'd still break out the multimeter, though...

    I don't own a graphics amp yet, but as long as you can switch the PSU on, you should be golden. I believe it's the green wire to a black one to kick it on; find a pinout :)
     
    Dark_ likes this.
  31. Dark_

    Dark_ Notebook Consultant

    Reputations:
    154
    Messages:
    124
    Likes Received:
    35
    Trophy Points:
    41
    Thanks. From the pictures it looks like a standard ATX 20/24-pin connector to the PCIe board, 2x 6/8-pin connectors for the GPU, and possibly a 12V line for the fan (if it's not connected directly to the PCIe board). Mine shows up tomorrow so I'll take a look at it.

    Next steps are to come up with a chassis design that will shrink the AGA down yet still support a full-length GPU.
     
  32. xbouncingsoulx

    xbouncingsoulx Notebook Enthusiast

    Reputations:
    7
    Messages:
    46
    Likes Received:
    14
    Trophy Points:
    16
    I changed the PSU. Works flawlessly.
    You can see the pictures a few pages back.
     
  33. kakashisensei

    kakashisensei Notebook Consultant

    Reputations:
    41
    Messages:
    217
    Likes Received:
    27
    Trophy Points:
    41
    It was a ~20k Fire Strike GPU score w/ external LCD, and ~16.5k w/ the internal LCD. This is with a 980 Ti w/ boost at 1440-1450 MHz, 8000 MHz mem.
     
  34. MSGaldenzi

    MSGaldenzi Notebook Deity

    Reputations:
    109
    Messages:
    1,251
    Likes Received:
    85
    Trophy Points:
    66
    I still need to comb through the 74 pages, but can anyone really quickly answer my question? When you put an AMD card in the graphics amplifier slot, do you need to install the drivers on the notebook? How does this affect things when you go back to the built-in GPU? I am assuming that it will automatically detect that you want to run from the external card and apply the AMD drivers you have installed, and then when you reboot after unplugging, it notices you want the Nvidia drivers to be active and starts with those, but I just wanted confirmation from someone here who has tried this and possibly does it on a semi-regular basis.

    I just snagged an AW17 and got an amplifier on eBay for about $100, and the only card I have in the house that I am not using is an AMD card, so I just wanted to see how simple it was to get it all working. Sadly the card is an R9 270, so it's actually slower than the GTX 970m, but I figured I'd give it a whirl anyway and upgrade the card in the amplifier down the road, seeing as the 970m is decent enough for the low amount of gaming I do anyway.
     
  35. Game7a1

    Game7a1 ?

    Reputations:
    529
    Messages:
    3,159
    Likes Received:
    1,040
    Trophy Points:
    231
    Once you insert the R9 270 (AMD GPU) in the GA and plug the GA into the laptop, you reboot, and then you should install the latest AMD drivers (it won't do it for you). Once that's done, you're pretty much good to go. The Nvidia and AMD drivers will remain inactive when an AMD GPU or Nvidia GPU is present, respectively, so there won't be any, for lack of a better word, driver mishaps or missing GPUs, and the performance of both GPUs should be the same.
    I used to have an R9 285 (well, I still have it, but I'm selling it), and using my laptop (13 + GTX 860m) with the GA, swapping between the GTX 860m and the R9 285 was seamless and mostly without problems. I didn't have to reinstall drivers, and updating drivers was fine as well.
     
    MSGaldenzi likes this.
  36. MSGaldenzi

    MSGaldenzi Notebook Deity

    Reputations:
    109
    Messages:
    1,251
    Likes Received:
    85
    Trophy Points:
    66
    Awesome! Thanks for the quick reply!
     
  37. MSGaldenzi

    MSGaldenzi Notebook Deity

    Reputations:
    109
    Messages:
    1,251
    Likes Received:
    85
    Trophy Points:
    66
    So I just ran two Fire Strike benchmarks for LOLs and here is what I got. These are from a base AW 17 R2 with an SSD.

    6566 - with the 970m
    5111 - with the Amplifier and an R9 270 from an XPS 8700

    I am surprised at how much better the 970m is; the fans hardly kicked in either. I still think the amplifier is pretty sweet, but I am thinking it's going to be a while before I can put a worthy card in there. I may yank the GTX 960 out of my X51 R2 and see how it performs, but from what I have seen... I think it's going to be about the same as the 970m. (Actually, I really should just run Fire Strike on my desktop and compare.)
     
  38. grkstyla

    grkstyla Notebook Geek

    Reputations:
    0
    Messages:
    88
    Likes Received:
    2
    Trophy Points:
    16
    Heaps of overclocking performance to be gained on the 970m as well.
     
  39. grkstyla

    grkstyla Notebook Geek

    Reputations:
    0
    Messages:
    88
    Likes Received:
    2
    Trophy Points:
    16
    Does anyone know of a person or company that will do some modding to the amplifier in Sydney, NSW, Australia? I would like to make as much of the left panel (closest to the video card) perspex to show the card; I don't want it to cost too much, though.
     
  40. Game7a1

    Game7a1 ?

    Reputations:
    529
    Messages:
    3,159
    Likes Received:
    1,040
    Trophy Points:
    231
    So I did something crazy. I did a benchmark while on battery (note: it's the one with the lower score).
    http://www.3dmark.com/compare/3dm11/10461373/3dm11/10438282
    Aside from the CPU drop, the scores are pretty similar. It's kind of funny, but I do not think it can be replicated in the bigger laptops with quad-core i7s (I suppose the 15 with either the i5-4210H or i5-6300HQ could manage similarly, in the sense that the CPU doesn't suffer much).
     
  41. Dan.J

    Dan.J Notebook Geek

    Reputations:
    0
    Messages:
    88
    Likes Received:
    16
    Trophy Points:
    16
    I have a graphics amp question I'm curious about that isn't "laptop" related, but I figured maybe someone here would know. I have one of the new Alienware X51 R3s coming and plan to hook up an amp with a GTX 980 Ti in it. Since this is a micro desktop and only hooked up to an external monitor, I'm wondering if the X51's internal GTX 960 needs to stay in the machine, because I really will have no use for it as the amp will always be hooked up to it. Just curious is all, since it's a little different setup than my 17 R3.
     
  42. Game7a1

    Game7a1 ?

    Reputations:
    529
    Messages:
    3,159
    Likes Received:
    1,040
    Trophy Points:
    231
    You'd probably be the first person to own both the X51 R3 and GA in this forum, so it's hard to say. If it's anything like the laptops, the GTX 960 will be disabled, so whether it stays in the computer is almost irrelevant. I say almost because if you want to, you could remove it in favor of better heat dissipation in the desktop if you don't have liquid-cooling.
    I mean, if you don't plan on ever using it, change the main display GPU in the BIOS to the iGPU (for now), then put the GTX 960 in an anti-static bag, plug in an HDMI cable to the iGPU (top HDMI port), and do whatever you want.
     
  43. Dan.J

    Dan.J Notebook Geek

    Reputations:
    0
    Messages:
    88
    Likes Received:
    16
    Trophy Points:
    16
    Mine is coming with the water-cooled CPU setup, but I was thinking of pulling out the 960 and selling it, and figured it could only help as far as temps and such. I just wasn't sure if, with the amp setup, there HAD to be a dedicated GPU in the X51. I guess I will just yank it and see once I have it.
     
  44. Game7a1

    Game7a1 ?

    Reputations:
    529
    Messages:
    3,159
    Likes Received:
    1,040
    Trophy Points:
    231
    I wouldn't sell it for warranty purposes. But that's just me.
    I would play around with the set-up before deciding what is best.
     
  45. Game7a1

    Game7a1 ?

    Reputations:
    529
    Messages:
    3,159
    Likes Received:
    1,040
    Trophy Points:
    231
    Note to whoever: don't plug the GA cable into the laptop while the laptop is in hibernation mode. None of the dGPUs will appear if you do.
    Or don't leave the GA cable plugged in while in hibernation if you're using the internal dGPU (you can manage this by letting the laptop prompt for a restart or, when desired, setting the cable's button function to restart).
     
  46. MSGaldenzi

    MSGaldenzi Notebook Deity

    Reputations:
    109
    Messages:
    1,251
    Likes Received:
    85
    Trophy Points:
    66
    Why not just put the 980 Ti in the X51 R3? I know some people put the 980 in there without a hitch. I assume the steps would be similar to the ones posted in the desktop subforum here in the Alienware section about putting a 970 in the X51 R2. I am guessing you already considered this, but just letting you know that, as far as I know, it can be done in case you didn't already know.
     
  47. Game7a1

    Game7a1 ?

    Reputations:
    529
    Messages:
    3,159
    Likes Received:
    1,040
    Trophy Points:
    231
    The 330W PSU won't handle the computer + GTX 980 Ti (a 250W GPU), and the size of the GTX 980 Ti requires a fan mod, which will void the warranty (so would a PSU mod).
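    Rough power-budget math for that idea (the PSU and GPU figures are from this post; the CPU and "rest of system" numbers are assumptions):

    Code (Python):
    # Nominal TDPs only; an overclocked card can pull well past its rating.
    psu_watts   = 330    # stock PSU (from the post above)
    gpu_tdp     = 250    # GTX 980 Ti reference TDP
    cpu_tdp     = 65     # assumed desktop quad-core (e.g. i7-6700 class)
    rest_of_box = 40     # board, drives, fans, USB (rough assumption)

    draw = gpu_tdp + cpu_tdp + rest_of_box
    print(f"estimated peak draw: {draw} W vs {psu_watts} W PSU")
    # -> 355 W vs 330 W, so the stock PSU has no headroom for a 980 Ti.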
     
  48. Dan.J

    Dan.J Notebook Geek

    Reputations:
    0
    Messages:
    88
    Likes Received:
    16
    Trophy Points:
    16
    Yeah, no way 330 watts is going to cut it for the whole system. I personally wouldn't do it with just a normal 980, and the 980 Ti is a big jump up in power consumption compared to it, especially overclocked. On top of all that, a 980 Ti would run HOT in that tight space, no doubt.
     
  49. MSGaldenzi

    MSGaldenzi Notebook Deity

    Reputations:
    109
    Messages:
    1,251
    Likes Received:
    85
    Trophy Points:
    66
    Gotcha. I was unaware the Ti's consumption was that much higher than the standard 980's. Either way, can't wait to see your results.
     
  50. ttfid

    ttfid Notebook Geek

    Reputations:
    0
    Messages:
    97
    Likes Received:
    7
    Trophy Points:
    16
    What are 'slim fit' computers used for? A 'tiny house' perhaps?
     