I have not tried that, but the latest version of DDU has a 'disable Windows automatic driver update' setting and I have used that since July.
The bigger problem would be bandwidth, as I think the PCIe lanes used are the same ones the internal GPU uses?
The SLI problem you refer to is "microstutter" or frame latency, which both AMD and Nvidia have worked hard to fix over the last year or so. It's now virtually gone, and assuming a good SLI profile and GPU utilization, dual GPUs are pretty much as good as a single one in that regard. -
Do you mean the GPU score or the overall score? Maybe you are confusing it with your Fire Strike scores? But 18500 would be too low even for the GPU Fire Strike score...
If you are comparing overall scores, I'd like to know your XTU settings and your Physics and Combined scores -
You guys with an Amplifier:
I got my AW17 R3 and I noticed something strange... my PCI-E bandwidth is only PCI-E 2.0 x4, not PCI-E 3.0!
Is anyone else seeing the same? You can look it up in GPU-Z.
Is this because the AW17 R3 shares its PCI-E port between the GA and the PCI-E M.2 SSDs?
If this is true, it's just ********... halving the bandwidth yet again -
If anyone has a 13 R2 or 15 R2 and a Graphics Amplifier with a desktop GPU, upload a GPU-Z screenshot as well. I want to see if the PCI-E 2.0 x4 bandwidth is lineup-wide.
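If you'd rather not screenshot GPU-Z, the link state can also be read straight from the driver. A minimal sketch, assuming an Nvidia card and that nvidia-smi is on the PATH (it ships with the driver):

```python
# Query the current and maximum PCIe link generation/width via nvidia-smi.
# Assumes an Nvidia GPU and nvidia-smi on the PATH.
import subprocess

result = subprocess.run(
    ["nvidia-smi",
     "--query-gpu=name,pcie.link.gen.current,pcie.link.gen.max,"
     "pcie.link.width.current,pcie.link.width.max",
     "--format=csv"],
    capture_output=True, text=True, check=True,
)
print(result.stdout)
```

Check the "current" values under load; the link trains down to save power at idle, so idle readings can look lower than the slot really is. -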
Here is mine
AW17 R3
6820HK
PCI-E SSD
I wonder if that bandwidth would go up if we just used the SATA port or put a SATA M.2 SSD in it?!
Edit:
At least it doesn't affect gaming much. My 3DMark score stayed the same (a bit higher, thanks to the better CPU).
http://www.3dmark.com/3dm/8904699?
Edit2:
So I just tried running the laptop without any SSD in the M.2 slot... still the same bandwidth. So it has to do with the design of the mainboard, I think. Nothing we can change in any way.
@Alienware-L_Porras could you let the techs know? Maybe there will be another revision of the mainboard... This is definitely a step backwards.
Edit3:
So maybe I was a bit hasty. I just asked Frank Azor via Twitter about the problem. He said last gen was PCIe 2.0 too; I can't recall exactly whether that's true for the quad-core AW15 or AW17. -
x4 @2.0 doesn't seem like it could possibly be enough bandwidth for a high-end card? -
This is a physical restriction on the PCI-E port and cable.
(Oh, and remove the @ in front of 2.0 because there's a member called 2.0, and he'll be receiving an alert that you tagged him.)
@raiden87 No, the 15 R1 and 17 R2 were at PCI-E 3.0 x4 when using the GA. -
x4 3.0 is more reasonable, as it equates to x8 2.0.
But I feel the need to check my 15 now; if it's 2.0 x4, it's definitely going back. No way that wouldn't bottleneck a high-end card.
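For anyone who wants to sanity-check the lane math, a quick back-of-envelope in Python using the per-lane rates from the PCIe specs:

```python
# Per-lane throughput from the PCIe specs: 2.0 runs 5 GT/s with 8b/10b
# encoding, 3.0 runs 8 GT/s with 128b/130b. Figures are per direction,
# before protocol overhead.
def lane_gb_per_s(gen):
    raw_gt_s, efficiency = {2: (5.0, 8 / 10), 3: (8.0, 128 / 130)}[gen]
    return raw_gt_s * efficiency / 8  # bits -> bytes

for gen, lanes in [(2, 4), (2, 8), (3, 4), (3, 8)]:
    print(f"PCIe {gen}.0 x{lanes}: {lane_gb_per_s(gen) * lanes:.2f} GB/s")
```

So 3.0 x4 (~3.9 GB/s) is a hair under 2.0 x8 (4.0 GB/s), while 2.0 x4 is only half of either. -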
I have no problem with PCI-E 3.0 x4, since there isn't any real-world difference... But at 2.0 I am pretty sure we will see some games that run slower with the GA (GTX 980) than without it (if you have a GTX 980m).
@Game7a1 did you take that screenshot yourself? I am a bit disappointed now, since the Skylake CPUs run so much cooler than Haswell... It would have fixed the main issue of the previous generation. Now there are new ones... Can't decide which is worse. -
The average drop from PCI-E 3.0 x8 to PCI-E 2.0 x4 with the same CPU is 13% at 1080p, and the difference shrinks at higher resolutions. So no, the GTX 980m won't pull ahead of the GTX 980 in the GA in some games (especially since the GTX 980m is closer to the GTX 970 than to the GTX 980). However, it does mean the performance loss from PCI-E 3.0 x4 is between 8 and 9%, though if the CPU is strong enough, the difference may not be noticeable or present. You can read more about PCI-E scaling on the GTX 980 here, if you haven't already.
It also means that getting a GTX 970 when you have a GTX 980m is ill-advised with the new laptops (to be honest, I always thought it wasn't the best idea with the older laptops either; it's almost like getting a GTX 960 for the GA when you had a GTX 970m in the laptop, or an R9 285 when you have the R9 m295x).
The image came from the review I linked. I don't own a 15 R1 or 17 R2.
I'm more inclined to think the limitation is partially due to the Thunderbolt 3 port with its PCI-E 2.0 x4 bandwidth (the PCI-E SSDs can also be a factor). And because of the lower bandwidth, the new Alienware laptops have two ways of getting the same eGPU performance: through the GA port or through the TB3 port, if the support is there.
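To put rough numbers on the comparison, a sketch normalizing everything to a desktop GTX 980 at full bandwidth; note the 0.75 factor for the GTX 980m is my own assumption, not a figure from the review:

```python
# Everything normalized to a desktop GTX 980 at full bandwidth = 1.0.
# The 13% and 8-9% penalties are the review's 1080p averages; the 0.75
# factor for the GTX 980m is my own rough assumption, not a benchmark.
gtx980_full      = 1.00
gtx980_pcie2_x4  = gtx980_full * (1 - 0.13)    # avg loss at 2.0 x4
gtx980_pcie3_x4  = gtx980_full * (1 - 0.085)   # avg loss at 3.0 x4
gtx980m_internal = 0.75                        # assumed relative speed

print(f"GTX 980 @ 2.0 x4 : {gtx980_pcie2_x4:.2f}")
print(f"GTX 980 @ 3.0 x4 : {gtx980_pcie3_x4:.2f}")
print(f"GTX 980m internal: {gtx980m_internal:.2f}")
```

Even with the worst average penalty, the desktop 980 stays ahead of the assumed 980m; it would take a game with far-above-average PCIe sensitivity to flip the order. -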
Yeah, of course! I don't think there are many games where it will happen, but looking at the link you posted, WoW could be a good candidate. The scaling hit there is pretty hard (x8 3.0 vs x4 2.0). I could imagine the GTX 980m performing better in that particular case.
If I read the article and your comments right, the difference will be bigger if the game is more CPU-demanding? -
If all else fails, get an AMD GPU. I hear they have better PCI-E scaling than Nvidia GPUs (if you have a good CPU, of course. If you don't, like in my case, then Nvidia's the only option).
I do hope Alienware finds a way to reverse this downgrade for the new laptops as they all support PCI-E 3.0. Possibly through disabling the TB3 bandwidth or something similar. -
As for the 13 R2 not getting PCI-E 3.0, it seems as if it's getting BIOS-shafted again. Sucks, but that's what happened with the 13 R1. -
I thought the same thing. But maybe it has something to do with Optimus, which was always a plus for eGPUs. I am not that deep into this tech stuff :/ For the AW13 I don't have an answer... I can only imagine it's because of the Thunderbolt port and/or because of the PCIe M.2 slots. -
This happens with every company and every product, due to greed.
I guess 'test engineering' is a lost field of work. -
Glad to hear there will be a BIOS fix for the PCIe issue. Hope this is very soon.
Frank talking about a negligible difference is very disingenuous. It's important to remember that many people expect the AGA to last several generations; that's sort of the whole point.
If it's bottlenecking a high-end card now, it will be bottlenecking a mid-range card next gen. x4 3.0 is already pushing it; 2.0 would be completely unacceptable. -
Hmm, can't really say much to that... back in the M17x R days I think they were good at fixing stuff. But if we look at the latest problems with fan tables... I wouldn't say so :/ I hope they do it fast.
Frank promised it to me. -
Can someone confirm whether a PCIe SSD affects performance of the Graphics Amplifier? -
They don't affect it... -
I have a Sapphire 390 that is 12" long. I know the official spec is 10.5"; however, would it fit anyway? My Corsair 380T case says up to 10.5", yet the 12" 390 just fits. Pics: https://goo.gl/photos/ApKDND1CREzX15qU7
Does the 390 work fine in general? Has anyone else taken a Dremel to their Graphics Amp yet? -
i7-6700HQ = up to 1x16, 2x8, 1x8+2x4
i7-6500U = 1x4, 2x2, 1x2+2x1 and 4x1 -
Has anyone swapped the PSU out with another? I'm asking for the sake of 'will it work', not because I think a GPU requires more power.
My thoughts are to yank out the 'brain' board and fit it to a much smaller chassis and use an SFX PSU to help keep the size down. -
I don't own a Graphics Amp yet, but as long as you can switch the PSU on, you should be gold. I believe it's the green wire to a black one to kick it on; find a pinout. -
Next steps are to come up with a chassis design that will shrink the AGA down yet still support a full-length GPU. -
I changed the PSU. Works flawlessly.
You can see the pictures a few pages back. -
I still need to comb through the 74 pages, but can anyone quickly answer my question: when you put an AMD card in the Graphics Amplifier, do you need to install its drivers on the notebook? And how does this affect things when you go back to the built-in GPU?
I am assuming it will automatically detect that you want to run from the external card and apply the AMD drivers you have installed, and then, when you reboot after unplugging, it notices you want the Nvidia drivers active and starts with those, but I just wanted confirmation from someone here who has tried this and possibly does it on a semi-regular basis.
I just snagged an AW17 and got an Amplifier on eBay for about $100, and the only card I have in the house that I am not using is an AMD card, so I wanted to see how simple it was to get it all working. Sadly the card is an R9 270, so it's actually slower than the GTX 970m, but I figured I'd give it a whirl anyway and upgrade the card in the Amplifier down the road, seeing as the 970m is decent enough for the low amount of gaming I do anyway. -
I used to have an R9 285 (well, I still have it, but I'm selling it), and using my laptop (13 + GTX 860m) with the GA, swapping between the GTX 860m and the R9 285 was seamless and mostly problem-free. I didn't have to reinstall drivers, and updating drivers was fine as well.
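If you ever want to double-check which adapters Windows actually sees after a swap, here's a minimal sketch using the third-party wmi package (my choice, not anything the GA software requires):

```python
# List every video controller Windows currently sees, with its driver
# version. Run it before and after plugging in the GA to verify the swap.
import wmi  # third-party package: pip install wmi (Windows only)

for gpu in wmi.WMI().Win32_VideoController():
    print(gpu.Name, "-", gpu.DriverVersion)
```
 -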
So I just ran two Fire Strike benchmarks for LOLs, and here is what I got. These are from a base AW 17 R2 with an SSD:
6566 - with the 970m
5111 - with the Amplifier and an R9 270 from an XPS 8700
I am surprised at how much better the 970m is; the fans hardly kicked in, either. I still think the Amplifier is pretty sweet, but I am thinking it's going to be a while before I can put a worthy card in there. I may yank the GTX 960 out of my X51 R2 and see how it performs, but from what I have seen... I think it's going to be about the same as the 970m. (Actually, I really should just run Fire Strike on my desktop and compare.)
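For what it's worth, the gap works out to roughly 28%:

```python
# Relative gap between the two graphics scores above.
score_970m, score_r9_270 = 6566, 5111
lead = (score_970m / score_r9_270 - 1) * 100
print(f"GTX 970m lead over the R9 270: {lead:.0f}%")  # ~28%
```

That lines up with the R9 270 sitting a couple of tiers below the 970m. -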
Does anyone know anyone, or a company, that will do some modding to the Amplifier in Sydney, NSW, Australia? I would like to make as much as possible of the left panel (closest to the video card) perspex to show the card; I don't want it to cost too much, though. -
So I did something crazy. I did a benchmark while on battery (note: it's the one with the lower score).
http://www.3dmark.com/compare/3dm11/10461373/3dm11/10438282
Aside from the CPU drop, the scores are pretty similar. It's kind of funny, but I do not think it can be replicated in the bigger laptops with quad-core i7s (I suppose the 15 with either the i5-4210H or i5-6300HQ could manage similarly, in the sense that the CPU doesn't suffer much). -
I have a Graphics Amp question I'm curious about that isn't "laptop" related, but I figured maybe someone here would know. I have one of the new Alienware X51 R3s coming and plan to hook up an Amp with a GTX 980 Ti in it. Since this is a micro desktop and will only be hooked up to an external monitor, I'm wondering whether the X51's internal GTX 960 needs to stay in the machine, since I really will have no use for it while the Amp is always hooked up. Just curious is all, since it's a slightly different setup than my 17 R3. -
I mean, if you don't plan on ever using it, change the main display GPU in the BIOS to the iGPU (for now), then put the GTX 960 in an anti-static bag, plug an HDMI cable into the iGPU (top HDMI port), and do whatever you want. -
I wouldn't sell it for warranty purposes. But that's just me.
I would play around with the set-up before deciding what is best. -
Note to whoever: don't plug the GA cable into the laptop while the laptop is in hibernation. None of the dGPUs will appear if you do.
Or don't leave the GA cable plugged in during hibernation if you're using the internal dGPU (this is possible if you let the laptop prompt for a restart or, when desired, set the cable's button function to restart). -
Gotcha. I was unaware the Ti's power consumption was that much higher than the standard 980's. Either way, can't wait to see your results. -