The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information that had been posted on the forums was preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.

    Will the Graphics Amplifier bottleneck a GTX 1080 so badly that I'm better off getting a GTX 1070?

    Discussion in '2015+ Alienware 13 / 15 / 17' started by TareX, May 20, 2016.

  1. TareX

    TareX Notebook Geek

    Reputations:
    0
    Messages:
    81
    Likes Received:
    11
    Trophy Points:
    16
    Money's not a problem... I know it's overpriced now, but my question is: with the Amplifier's x4 PCIe connection and roughly 32 Gbps of bandwidth, will the gains be so minimal due to bottlenecking that it doesn't make sense to get the 1080 over the 1070?

    I just want to play Project CARS all maxed out in VR comfortably over 90 fps... and hopefully GTA V on the Vive.
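    For reference, a rough sketch of where those bandwidth figures come from (a back-of-the-envelope calculation in Python, assuming the standard PCIe 3.0 rate of 8 GT/s per lane with 128b/130b encoding; real-world usable throughput is a bit lower):

        # Approximate PCIe 3.0 bandwidth per direction.
        # 8 GT/s per lane with 128b/130b encoding ~= 7.88 Gbit/s usable per lane.
        per_lane_gbps = 8 * 128 / 130

        for lanes in (4, 16):  # x4 = Graphics Amplifier link, x16 = desktop slot
            gbps = per_lane_gbps * lanes
            print(f"x{lanes}: ~{gbps:.0f} Gbit/s (~{gbps / 8:.1f} GB/s)")

        # x4:  ~32 Gbit/s  (~3.9 GB/s)
        # x16: ~126 Gbit/s (~15.8 GB/s)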
     
    Daniel1983 likes this.
  2. iunlock

    iunlock 7980XE @ 5.4GHz

    Reputations:
    2,035
    Messages:
    4,533
    Likes Received:
    6,441
    Trophy Points:
    581
    Great question. Although the bandwidth has its limitations, it's not enough to cripple the 1080 relative to the 1070 or make a noticeable impact on your gameplay. If your pockets are deep, I'd just go for the 1080 without hesitating.
     
    Saveikis likes this.
  3. Saveikis

    Saveikis Notebook Consultant

    Reputations:
    32
    Messages:
    142
    Likes Received:
    42
    Trophy Points:
    51
    I agree with @iunlock. The bandwidth limitations are noticeable on paper when you are reading the specs, but in real life you will barely see or feel the difference while playing any triple-A game or using your GPU for productivity work. I had the same situation with my older motherboard and a newer GPU, where the GPU's bandwidth was higher than the mobo's... The end result was that I did not notice any difference at all. Maybe a few fps dropped here or there, but you won't feel it unless you are doing benchmarks, and even then the numbers should be close :)
     
    iunlock likes this.
  4. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    7,110
    Messages:
    20,384
    Likes Received:
    25,139
    Trophy Points:
    931
    If the AGA is going to bottleneck a GPU, you don't want to get a slower GPU just because of that: if it's going to make a faster card go slower, it's also going to make a slower card go even slower. They both take a performance hit running through the AGA :)

    In fact, if you wanted 1070 performance results, then you should get the 1080 to make sure you get at least 1070 results, or better.

    If money isn't the object, then build a desktop system with the new Broadwell-E and two 1080's; that should kick things up several notches.

    Or, get the 1080 for the AGA... see if you can get one with water cooling; might as well kick things up a notch with what you have available.

    You can always move the 1080 with water cooling to your new desktop later :)
     
  5. TareX

    TareX Notebook Geek

    Reputations:
    0
    Messages:
    81
    Likes Received:
    11
    Trophy Points:
    16
    How does a 1080 with water cooling even fit into the AGA? Also, I'm only in North America for 3 weeks... so my options are either the reference design NVIDIA 1080 (+$100) or the 1070... Will I be able to "add" liquid cooling later to the 1080?
     
  6. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    7,110
    Messages:
    20,384
    Likes Received:
    25,139
    Trophy Points:
    931
    It's a bit of a chore, even with a "kit" decked out with all the parts and tools, but it can be done.

    You would get a closed loop radiator that would fit outside the case; I think there is one posted with photos in this thread.

    Definitely go for the 1080, it's got the speed :)
     
  7. rinneh

    rinneh Notebook Prophet

    Reputations:
    854
    Messages:
    4,897
    Likes Received:
    2,191
    Trophy Points:
    231
    The 980 Ti is hardly limited by the bandwidth of the AGA. The 980 Ti actually moves more data over the PCI Express bus than the 1080, so I really doubt the 1080 would have a bigger problem with the bandwidth being insufficient.

    The 1080 has a much stronger compression algorithm, so in software it effectively gets more out of the same bandwidth than a 980 Ti does.
     
    iunlock and hmscott like this.
  8. orancanoren

    orancanoren Notebook Consultant

    Reputations:
    5
    Messages:
    243
    Likes Received:
    42
    Trophy Points:
    41
    Do you know how much performance loss there is with a 980 Ti in an AGA compared to a desktop? I was told there is a 5% to 10% loss, but I couldn't find benchmark comparisons on this. Do you know of any comparison results?
     
  9. webjeff

    webjeff Notebook Evangelist

    Reputations:
    7
    Messages:
    301
    Likes Received:
    40
    Trophy Points:
    41
    I play Project Cars now on the GA w/ a 980TI no problem. I would imagine the 1080 would be better than the 980TI.

    Just my 2 cents.
     
    hmscott likes this.
  10. rinneh

    rinneh Notebook Prophet

    Reputations:
    854
    Messages:
    4,897
    Likes Received:
    2,191
    Trophy Points:
    231
    I don't know where I got that data from, but yes, up to 10% performance loss.
     
    hmscott likes this.
  11. iunlock

    iunlock 7980XE @ 5.4GHz

    Reputations:
    2,035
    Messages:
    4,533
    Likes Received:
    6,441
    Trophy Points:
    581
    I too have read about the GA having a ~10% hit.

    Powered by: Quad Core Exynos + 6820HK
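    To put that ~10% figure in context of the OP's 90 fps VR target, a tiny back-of-the-envelope check (same Python sketch style as above; the 10% penalty is just the upper end of the loss figures quoted in this thread):

        # What a ~10% AGA penalty means for a 90 fps VR target.
        target_fps = 90
        aga_penalty = 0.10  # upper end of the loss figures quoted above

        needed_desktop_fps = target_fps / (1 - aga_penalty)
        print(f"Desktop-equivalent headroom needed: ~{needed_desktop_fps:.0f} fps")  # ~100 fps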
     
    hmscott likes this.
  12. Daniel1983

    Daniel1983 Notebook Evangelist

    Reputations:
    52
    Messages:
    380
    Likes Received:
    187
    Trophy Points:
    56
    I'm using the AGA with an overclocked Titan X, mostly to enjoy Project Cars (I'm an addict) LoL... I had the same question as you about adding the GTX 1080 to the AGA & the limitations that may impose.

    Does anyone know if there is/will be a new (non-Alienware) amplifier that can use the Thunderbolt/USB Type-C connector? Hopefully something like that gets released before the 1080 Ti does.

    Video of my setup:


    Got a brand new (maxed spec) AW17R3 on the way; should ship on the 26th, hoping for a decent increase in performance over my 17R2.
     
    orancanoren, zergslayer69 and hmscott like this.
  13. DeeX

    DeeX THz

    Reputations:
    254
    Messages:
    1,710
    Likes Received:
    907
    Trophy Points:
    131
    The performance will be almost exactly the same. Skylake chips are more energy efficient but actually perform slightly slower...
    For example, the 4720HQ from the R2 is the Haswell equivalent of the 6700HQ from the R3. The Skylake version actually gets slightly lower benchmark scores than the 4720HQ Haswell.

    Also, I hope you noticed this, but the highest processor currently offered on the 17R3 is the 6820HK, and it scores about 9000 in Passmark. The processor in your current 17R2, the 4980HQ, scores 10065.
    So expect a possible performance decrease. However, the 6820HK is unlocked, so you can overclock it.

    Now, with all that said, the 17R2 had a design flaw in the power system and thus had a crippled BIOS, and would throttle under conditions that the Alienware 17R3 will not.
    So the 17R3 is far superior.
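    As a quick sanity check on those Passmark numbers (same sketch style; the scores are the ones quoted just above, and real-world results vary by sample, cooling, and overclock):

        # Rough comparison of the Passmark scores quoted above (stock clocks).
        score_4980hq = 10065  # i7-4980HQ in the 17R2
        score_6820hk = 9000   # i7-6820HK in the 17R3 (approximate)

        delta = (score_4980hq - score_6820hk) / score_4980hq
        print(f"6820HK scores ~{delta:.0%} lower at stock")  # ~11% lower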
     
    iunlock and Daniel1983 like this.
  14. TareX

    TareX Notebook Geek

    Reputations:
    0
    Messages:
    81
    Likes Received:
    11
    Trophy Points:
    16
    This is precisely why I went for the Alienware 15 R2 with the 6820HK and not the Razer with the 6700HQ... +20% performance (without even overclocking) is nothing to dismiss. I just hope Alienware comes up with a plug-and-play Amplifier that uses the Thunderbolt port... the extra 8 Gbps could also be helpful when plugging external storage into the USB slots.
     
    iunlock likes this.
  15. orancanoren

    orancanoren Notebook Consultant

    Reputations:
    5
    Messages:
    243
    Likes Received:
    42
    Trophy Points:
    41
    Could you share a stock benchmark result with your AGA-Titan X so that we can see the performance comparison between an AGA and a desktop? This whole PCI-e x4 thing makes me uncomfortable about getting a GTX 1070. BTW awesome setup!
     
  16. Daniel1983

    Daniel1983 Notebook Evangelist

    Reputations:
    52
    Messages:
    380
    Likes Received:
    187
    Trophy Points:
    56
    Thanks. I have already shared the AGA Titan X benchmarks on this site a few times. CLICK HERE Page #142, Post #1411.

    Edit: That benchmark is an overclocked Titan X... I'm not the right person to ask for stock clocks. I don't believe in such a thing. LoL
     
    Last edited: May 22, 2016
    iunlock likes this.
  17. Game7a1

    Game7a1 ?

    Reputations:
    529
    Messages:
    3,159
    Likes Received:
    1,040
    Trophy Points:
    231
    Well, the USB ports on the GA don't hamper the bandwidth, since it uses a direct PCIe connection, whereas they would if the GA used Thunderbolt 3, because the data is shared.
    The performance would pretty much be identical in most cases.
     
  18. Daniel1983

    Daniel1983 Notebook Evangelist

    Reputations:
    52
    Messages:
    380
    Likes Received:
    187
    Trophy Points:
    56
    I don't understand what you mean here... The AGA cable itself carries data at x4 PCIe Gen 3 rates. That's about 4GB/s. For comparison, a standard GPU connection on a desktop is a full x16 Gen 3 connection, which carries about 16GB/s... Thunderbolt 3 carries 40GB/s... That is a HUGE difference, so it should perform MUCH better. What exactly do you mean when you say "the data is shared"?
     
  19. Game7a1

    Game7a1 ?

    Reputations:
    529
    Messages:
    3,159
    Likes Received:
    1,040
    Trophy Points:
    231
    TB3 doesn't carry 40 GBps. It carries 40 Gbps (Gbps = gigabits per second; GBps = gigabytes per second), which is just double that of TB2. Essentially, it runs at PCIe 3.0 x4, just like the GA.
    When I say the data is shared, I mean the USB ports and the GPU share the bandwidth of the TB3 connection, and that's not the case with the GA.
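    Putting the units side by side makes the difference obvious (same sketch style; these are raw link rates, ignoring protocol overhead, with the usual desktop x16 slot included for comparison):

        # Bits vs. bytes: why 40 Gbps is nowhere near 40 GB/s.
        links = {
            "Thunderbolt 3": 40,          # Gbit/s link rate
            "AGA (PCIe 3.0 x4)": 32,
            "Desktop PCIe 3.0 x16": 128,
        }
        for name, gbps in links.items():
            print(f"{name}: {gbps} Gbit/s = {gbps / 8:.0f} GB/s")

        # Thunderbolt 3:        40 Gbit/s  = 5 GB/s
        # AGA (PCIe 3.0 x4):    32 Gbit/s  = 4 GB/s
        # Desktop PCIe 3.0 x16: 128 Gbit/s = 16 GB/s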
     
  20. Daniel1983

    Daniel1983 Notebook Evangelist

    Reputations:
    52
    Messages:
    380
    Likes Received:
    187
    Trophy Points:
    56
    That is incredibly stupid if it actually shares the bandwidth like that. WoW. All these new machines are only good for school work & 1080p; we will never be able to achieve proper 4K gaming here.
     
  21. Game7a1

    Game7a1 ?

    Reputations:
    529
    Messages:
    3,159
    Likes Received:
    1,040
    Trophy Points:
    231
    Wait, sorry. I misremembered; the whole "data is shared" part was actually about the additional reduction in performance. You can find a bit more (correct) info here.
    The GBps part, though, is still correct.
     
  22. Daniel1983

    Daniel1983 Notebook Evangelist

    Reputations:
    52
    Messages:
    380
    Likes Received:
    187
    Trophy Points:
    56
    Thank-you. That brings much more clarity to things.

    Basically, the gaming laptops are only good for VR; don't even think about 4K, which makes the 4K screen on gaming laptops next to useless for gaming. The screens are of an overall higher quality, though, and you can always play at 1080p on them, so they do a decent job for 4K media playback & 4K video editing.

    Personally, I'm going to wait until the GTX 1080 Ti drops; theoretically, two of those overclocked in a water-cooled desktop rig should definitely be able to handle all games on maxed-out settings @ 4K, 60+ FPS. That's the Project Cars gameplay dream I'm chasing on a 55" curved 4K OLED (*fingers crossed for G-Sync*) display. Looks like we still have another 12+ months to go.
     
    Last edited: May 22, 2016