Both the port and the cable need to have 16 lanes for the PCIe connection to be x16. This would require a redesign (slight or heavy) of the laptops, the X51's back panel, and the GA itself.
An adapter won't fix things as, well, you'll still be limited to 4 lanes.
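For anyone who wants actual numbers: PCIe 3.0 runs at 8 GT/s per lane with 128b/130b encoding, so per-link bandwidth is easy to work out. A quick back-of-the-envelope sketch (spec numbers, one direction, ignoring protocol overhead):

# PCIe 3.0: 8 GT/s per lane, 128b/130b encoding
per_lane_gbs = 8e9 * (128 / 130) / 8 / 1e9  # ~0.985 GB/s per lane
for lanes in (4, 8, 16):
    print("x%-2d ~ %.1f GB/s" % (lanes, lanes * per_lane_gbs))
# prints: x4 ~ 3.9 GB/s, x8 ~ 7.9 GB/s, x16 ~ 15.8 GB/s

So the GA's x4 link has roughly a quarter of the bandwidth of a desktop x16 slot, which only matters when a game actually saturates it.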
-
Being PCIe 3.0 @ x4, it doesn't really seem to hurt performance even on high-end cards from what I see. I'm on a GTX 980 Ti and it performs on par with what is normal for the card in whatever benchmark or game it's being used for. I guess what I'm saying is that the limited PCIe lanes don't seem to limit performance much, if at all, even on high-end cards.
-
I heard there's up to a 40-50% difference in performance between the GA and the actual desktop counterpart. That's a lot. You will have variation of up to 80 fps. That's a lot of money wasted on a high-end desktop GPU. Or am I getting it wrong?
-
Remember that the CPU also plays a role in total game performance. The quad-core i7 CPUs in the 15 and 17 laptops should be sufficient for the GPUs. The dual-core CPUs in the 13 and 15 R1 (i5-4210H) will struggle. Notebookcheck did a brief on the GA with the 13 R1 (i5-4210U and i7-5500U) and 15 R1 (although they are a bit dated). You can also read my GA performance analysis by hitting the link in my signature. -
I would imagine this may become a bigger issue with more powerful cards, but I highly doubt it will make a huge difference due to the way data transfer happens. It's only going to affect performance at times when demand saturates the available bandwidth. This doesn't impose an upper limit on performance so much as it reduces available performance at the most demanding times. -
Hey guys. I got my Alienware 17 R3 as soon as it came out in October. I just got the GA and a PNY GTX Titan. I gotta say it did not live up to my expectations. I think my built-in 980M did just as well at getting fps in games like Black Ops 3. Does anyone else have this problem? Is it really because of the cable, or is it just that the difference between a 980M and a GTX Titan isn't that big?
Thanks -
I believe it is the Titan X. The black one with 12 GB of memory.
-
Then something isn't right, but it isn't the cable. It sounds like your games are still running off the 980M and not the Titan X, probably a driver issue.
-
http://imgur.com/AEK4cLe
This is my setup. I might be wrong about it being a Titan X, but the laptop recognizes it as a Titan X and the drivers say they are up to date. Maybe I just seriously overestimated its power? It just doesn't run as fast as I thought it would. Would overclocking help? It might also be the game, Black Ops 3, which is the only one I have tested it on so far. -
Once you do this, run a couple of benchmarks on both your 980M and the Titan, and post them here.
I'd also like to see some GPU-Z captures if you can, so we can see the voltage and temps of the Titan in-game.
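If you enable "Log to file" on GPU-Z's Sensors tab, a short script can summarize a whole session instead of eyeballing screenshots. A minimal sketch, assuming the default log name and that the column headers contain "GPU Load" and "GPU Temperature" (both vary by card and GPU-Z version, so adjust to match your log):

import csv

# GPU-Z sensor logs are comma-separated with padded headers.
with open("GPU-Z Sensor Log.txt", newline="") as f:  # default file name is an assumption
    rows = list(csv.DictReader(f, skipinitialspace=True))

def series(substring):
    # Numeric values from the first column whose header contains substring.
    values = []
    for row in rows:
        for name, value in row.items():
            if name and substring in name:
                try:
                    values.append(float(value))
                except (TypeError, ValueError):
                    pass
                break
    return values

load = series("GPU Load")
temp = series("GPU Temperature")
if load:
    print("avg GPU load: %.0f%%" % (sum(load) / len(load)))
if temp:
    print("max GPU temp: %.0f C" % max(temp))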
Lots of things could be wrong, and something almost certainly is. -
I am not too familiar with benchmarking, but I would love to try to see what exactly is going on. So I guess I'll download GPU-Z. Anything else I should download or use? Would overclocking help at all?
Thanks everyone -
I am mostly annoyed right now that I spent a small fortune buying the GA and a Titan X hoping that it would perform much, much better. It doesn't seem like it was worth the money over the 980M already in the laptop. Hopefully it's just the game that's the problem, though.
-
What resolution are you running at?
A Titan is not going to stretch its legs unless you're pushing it. If you can run some benchmarks and post back, we can figure out where your issue is. -
http://gpuz.techpowerup.com/15/12/02/c66.png
So this is what GPU-Z spits out when I run MGS5 at 4K resolution with everything on extra high except for post-processing, which is off.
I get a steady 23 fps. Is that normal? I saw a video of a guy getting similar fps but at 8K resolution. -
http://gpuz.techpowerup.com/15/12/02/9rv.png
Lowering the resolution to 2048x1152 gets 33 fps. Even typing this is laggy while the game runs in the background.
I believe some readings, like the GPU load, drop to 0% when I alt-tab out. -
http://gpuz.techpowerup.com/15/12/02/4yk.png
One last capture at a resolution of 1980x1200 and fps of 34ish.
I am trying Fallout 4 now, and amazingly, with everything turned up to ultra (except godrays on low and shadow distance on medium), with TAA (the best antialiasing option), 16 samples, and max view distance, I am getting a solid 60 fps. http://gpuz.techpowerup.com/15/12/02/atk.png
Sorry, this is my first time doing this. I changed GPU-Z to show the highest readings. That should be easier to analyze? http://gpuz.techpowerup.com/15/12/02/uzg.png
Overall I feel like the dedicated memory usage is low: the Titan X is supposed to have 12 GB but is only using 3-4 GB. Or is that normal? -
http://gpuz.techpowerup.com/15/12/02/upt.png This was Metal Gear Solid 5 again with GeForce Experience optimization, which had everything on extra but the resolution at 1980x1200. Still stuck at around 35 fps. Is it just the games or what, haha.
-
The memory usage is probably normal. Very few games will come close to using the Titan's full complement (it's not really designed for gaming).
The framerates definitely seem very low. What CPU are you running? It looks like it could be a CPU bottleneck.
Can you also check in GPU-Z on the main screen that it's reporting the connection as PCIe 3.0?
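You can also confirm the live link from a command prompt with nvidia-smi (it ships with the Nvidia driver; if it isn't on your PATH, it's in the driver's install folder):

nvidia-smi --query-gpu=pcie.link.gen.current,pcie.link.width.current --format=csv

Run it while the GPU is under load, since the link can drop to a lower generation or width at idle to save power.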
Some models of AW are stuck on 2.0 if you don't have a BIOS update. -
http://imgur.com/uCHkEgS
These are the stats on almost everything. I do see that 2.0. Is that really a problem? How can I fix it? -
Also, uninstall your current Nvidia driver with DDU and install the latest from Nvidia (or a stable one, at least). -
Thank you guys so much. That BIOS update is ridiculous; my settings now show PCIe 3.0 x16 @ x8 3.0 instead of PCIe 2.0 x16 @ x8 2.0.
Hopefully that makes a huge difference? Currently updating ACC, but kind of worried about uninstalling drivers; it already has the latest one according to GeForce Experience. -
Update: my settings went down again, to PCIe 3.0 x16 @ x4 3.0 instead of PCIe 3.0 x16 @ x8 3.0. I don't know why or how, or even if that x4 instead of x8 makes a difference?
Metal Gear with all high settings and a slightly higher resolution, which was getting about 32 fps, is now a solid 53. -
Now it won't be bottlenecking. Play with your settings: you may find you can increase them further without impacting the framerate too much, or that lowering them shows a bigger performance increase. -
The mobile GPU, though, can use 8 lanes. -
Thank you all again for your help. I seem to be getting better fps results. Still not as good as I had hoped, but much better. I think others can learn from this. Update your BIOS!
-
So I finally got my AGA, set it up, and have a fairly annoying issue.
I can't turn off the laptop display.
The GTX 980 is enabled and pumping out to my 1440p monitor, but the Nvidia Control Panel only sees that monitor. The laptop display is still on and still the primary display.
The Intel driver sees only the laptop display and not the external one.
Surely I'm not stuck with using the laptop display as my primary?
I need to keep the laptop open due to space constraints... -
Select "Show desktop only on 2".
Done. The laptop's display is turned off.
The GPU in the GA does not control any monitor ran by the laptop/iGPU, only monitors connected to it. -
So Alienware has some charts to share. It involves the Alienware 17 R2 and the GTX 980 (and the GTX 980M, if you're wondering).
[Attached: alienware-graphics-amplifier-performance-chart-3d-mark_caldera (1).jpg]
-
Because each one is seeing a different screen, I can't even rearrange the monitors to set one as primary. -
Of course the Intel driver doesn't see the other monitor. The iGPU isn't powering it; the GTX 980 is, and vice versa for the laptop's display.
However, Windows can arrange the displays to your liking and set which one should be the primary one.
Don't do any monitor assigning or rearranging through the Intel Graphics or Nvidia Control Panel unless you have connected more than 1 monitor to each respective source.
Wait, now that I think about that part, I would think things would get messy. However, just use Windows's display manager. You should use the individual graphics control panels for configuring each monitor's settings and such.
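If you want a one-click way to do the same thing, Windows also ships DisplaySwitch.exe (the tool behind Win+P). Assuming the laptop panel is display 1 and the external is display 2, running this from a Run box or a shortcut should match "Show desktop only on 2":

DisplaySwitch.exe /external
-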
Yeah, that did it.
Yeah, that did it.
I had thought you were talking about going into the iGPU driver. After years of conditioning to never use the Windows display manager for any reason, I totally forgot it even existed. -
cruisin5268d Notebook Evangelist
Hey folks,
Just got my AGA ($142, Dell MPP!) for my 17 R3. I'm a bit torn on which GPU to get: a 980/980 Ti now, or a 970 now and roll with that until the new Nvidia GPUs come out. If they are as good as they appear to be, it might be worth the wait, but if I drop enough money for a 980 or 980 Ti then I'd probably wait until prices drop on the new Nvidia line before making the upgrade.
Ideally I want my monitor setup to be 3 G-Sync monitors. Currently I have two ASUS ROG Swifts, but they are the original gen. I got a killer deal at $540 each but like a fool didn't realize they weren't the new series. I was thinking about returning both of them to get one of the new ones (man, they are bloody expensive) and getting a second new Swift once prices start to come down. I'm also open to a curved-screen monitor, but I have yet to find a G-Sync model or one aimed at gamers.
So,
1) Which GPU should I get? 970, 980, or 980 Ti? Regardless of the series, any advice on the specific card would be appreciated, as I've seen lots of conflicting info about which cards fit and which don't.
2) Are the new Asus Swifts worth the extra money? Should I trade in my two for one of them or just roll with what I have and add a third one down the road?
3) Curved screen. Thoughts?
Thanks in advance folks. I've been out of the gaming world for many years and am playing catch up. -
I have an AW17R2 with a 980M. I went with the 980 Ti in my AGA. The 970 would've been about the same speed as the 980M.
I have the EVGA 980 Ti (not overclocked). I get around 12.5k in Fire Strike and have no issues gaming in anything on ultra at 2560x1440. (GTA V, Fallout 4, World of Warships, Project CARS)
G-Sync is worth it IMHO. Curved screen is a gimmick.
Initially there were many, many headaches with the 980 Ti and switching, but the last few Nvidia drivers have fixed it. Switching between laptop/AGA now works with no issues. -
Yeah right now it's not worth putting a card in there that's less than a Ti. Today's mobile GPUs are amazingly powerful.
-
cruisin5268d Notebook Evangelist
My thought with the 970 is it would give me multi-G-Sync-monitor support, and I'd run that until the new Nvidia cards come out. I'm not crazy about this idea, but I don't want to throw money down the drain now on a 980 Ti just to end up buying a new card in 6-12 months. In actuality I'd end up keeping the Ti for much longer.
From what I see, it looks like the 980 Ti can drive 3x 1440p screens without difficulty at high settings. I only have two now... so I'm thinking it should be able to drive 2 1440p screens maxed out on current games. Am I right or am I asking too much?
And yes, I agree that curved screens are mostly gimmick, but the idea of 3 of them wrapping around me is very groovy. I'm a bit OCD sometimes, and having 3 flat screens awkwardly angled around my head is just... awkward. -
cruisin5268d Notebook Evangelist
If I go the 980 Ti route... which specific models fit? I'd like to stay away from the reference cards.
I wonder if the EVGA card with the included water cooler would work and be long enough to replace the AGA fan. That would save me from needing to splice wires to replace the noisy fan it came with.
As I understand it for the Nvidia cards... Asus > MSI > Gigabyte > EVGA > Zotac. Is that order of overall quality + support + performance correct? -
kakashisensei Notebook Consultant
EVGA has the best support/warranty. Zotac probably has the best value with its AMP! and AMP! Extreme versions. The MSI Lightning, I hear, is one of the best top-end versions. The EVGA Kingpin and Classified are also supposed to be very good, but significantly more expensive. In terms of cooling, there isn't that significant a difference as long as it's not the stock Nvidia blower-style cooler. All the custom coolers have lots of heatpipes and multiple fans. You can adjust fan profiles/loudness via 3rd-party software. Coil whine is random, depending on the specific card sample, and I'd expect it with the stock GA power supply.
If you want one that fits without modding the case, you will have to sacrifice some options. Pretty much any version that has extra card height (typically for better voltage/power regulation) or bigger heatsinks won't fit. Heatsinks that take up more than 2 slots definitely won't fit.
Based on what I've seen, these are the cards that will fit without issue:
Any stock 980 Ti with the Nvidia blower-style cooler
Any EVGA 980 Ti with the ACX cooler (for hybrid coolers you have to mod the case)
Possibly the Gigabyte G1 -
cruisin5268d Notebook Evangelist
Everyone mentions modding the case. What type of work are we talking about? It was only $142 so I'm not super worried about damaging it; heck, it's a piece of junk anyway. I have to literally yank the lid to get it to open.
I want to stay away from the stock ones for some of the reasons you mentioned, namely performance. As for the ACX cooler you mentioned: does that include the newer ACX 2.0? I did some research on that and it seems they are cooler and quieter.
I looked at the G1. Looks pretty solid and has great reviews. And by that I mean it looks like it's a beastly card, after watching some YouTube videos. -
Nothing wrong with the "reference" cards from Nvidia; the blower style works GREAT at getting heat out of the case. I have been in this game a long time and have had MANY more issues with non-reference cards than with reference ones. And as far as overclocking goes, the reference cards generally do just fine. For example, my reference 980 Ti gets close to a 1500 MHz boost on the core with a 110% power limit while staying around 70C with 70% fan speed while gaming with heavy GPU usage. Just my opinion, of course.
-
GTX980 Alienware Graphics Amplifier FPS Test!(The Witcher 3) 4K
-
That was a nice bump up with the GA. How did it run better on 1080p ultra compared to medium, though?
-
kakashisensei Notebook Consultant
Do note that pretty much all power supplies will have issues fitting in the Graphics Amplifier. They do not have a matching power plug location, and most high-end ones are too long and prevent the lid from closing. I had to use an SFX power supply; it's not mounted, but it fits inside.
For EVGA, the ACX/ACX 2.0 coolers will fit with no issues. Note the Classified and Kingpin are not ACX coolers. -
cruisin5268d Notebook Evangelist
For the power supply, do you remember the specific model? If it fits well, then I definitely see replacing mine in my future.
For the MSRP of $300, this should have included a decent power supply capable of producing clean and regulated power, and the cases should have had some sort of quality control so they don't require high levels of force to open. And, last but not least, Dellienware should have made it a bit taller. Huge oversight on their part to design this so it doesn't fit the most common models its customers would want to use. -
I might consider trying the AGA with my new Alienware 17 R3. Have they fixed the bugs and issues?
-
Anyone get "windows 10 boot into black screen with mouse active"?