I'll run Fire Strike tomorrow on my desktop with my 970 Strix as a single card and grab the graphics number to compare directly against the laptop + AGA number. The performance loss, if I remember correctly, is about 10% in theory. So where I saw 13498 in the graphics score with the AGA, my actual desktop build will probably be closer to 14,600-14,800 on the graphics score.
My CPU, when the AGA is attached, runs at a solid 3.61GHz on all four cores, and will run at 3.8GHz in single-core benchmarking. My 4710HQ is exceptional, though; I have not, at this time, had any thermal throttling issues. The direct comparison in Fire Strike would be 10942 with the AGA and 9151 on the laptop on its own (980M).
reptileexperts Notebook Consultant
Here is a screen of my XTU for the 4710HQ when the graphics are handled by the AGA - it runs flawlessly for me, but it may not for you.
kakashisensei Notebook Consultant
3. The default power brick is 180W, which is fine for the 970M but inadequate for the 980M. With a 240W power brick and BIOS A00, the 980M won't throttle. Later BIOS versions use the battery to supply the additional power and do not utilize the extra wattage from the 240W brick, which does not stem the throttling.
4. There are various reasons for the performance loss, and not all may apply to a given game/application. One is the 32Gbps maximum bandwidth on the AGA connection, which is only PCIe 3.0 x4 link speed. Some applications may begin to saturate that bandwidth, though the performance loss is still small. That link carries the CPU feeding data to the AGA GPU, and potentially the AGA GPU sending frame data back to the Intel iGPU if you use the laptop screen. Ideally, PCIe 3.0 x8 would be more than enough; the extra PCIe lanes are there in the CPU, but Alienware decided not to wire them into the setup.
Furthermore, when using the laptop screen, the frame output has to be sent back to the laptop across that same connection, which consumes some of the limited bandwidth. Second, the laptop uses Optimus: the Intel GPU takes the frame buffer from the AGA GPU and still generates the final frame sent to the laptop screen, which has inherent inefficiencies. When an external monitor is connected directly to the AGA, there is no need to send frame output back across the AGA connection, and Optimus is not used.
Last edited: Aug 12, 2015
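As a back-of-the-envelope check on that 32Gbps figure, here is a quick sketch. It assumes PCIe 3.0's nominal 8 GT/s per lane and its 128b/130b line encoding, which the post above doesn't spell out:

```python
# Rough sanity check of the "32 Gbps = PCIe 3.0 x4" figure quoted above.
GT_PER_LANE = 8.0          # PCIe 3.0 signalling rate: 8 gigatransfers/s per lane
ENCODING = 128.0 / 130.0   # 128b/130b encoding efficiency (~98.5%)
LANES = 4                  # the AGA link is x4

raw_gbps = GT_PER_LANE * LANES        # 32 Gbps raw signalling rate
effective_gbps = raw_gbps * ENCODING  # ~31.5 Gbps of usable data
effective_GBps = effective_gbps / 8   # ~3.94 GB/s

print(f"raw: {raw_gbps:.0f} Gbps, effective: {effective_gbps:.1f} Gbps "
      f"({effective_GBps:.2f} GB/s)")
```

So the quoted 32Gbps is the raw line rate; usable bandwidth is a shade under that, and an x8 link would simply double it.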
reptileexperts Notebook Consultant
@orancanoren
http://www.3dmark.com/3dm/8162239?
OK, I matched my laptop's processor specs with my desktop build's 4790K and plugged in the 970 I used for my Alienware GA. So comparing it more or less apples to apples (using the same monitor and all that jazz), it's very, very close:
Alienware with AGA 970, CPU @ 3.757GHz - score 10942
Desktop custom build with 970, CPU underclocked to 3.757GHz - score 11140
http://www.3dmark.com/compare/fs/5545592/fs/5692759 final comparison
Note: the desktop run kept the processor clocked slightly higher, and the GPU OC was not exactly the same (the laptop was OC'd with NVIDIA Inspector, the desktop with my ASUS GPU Tweak), for full disclosure.
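For context, the gap between those two runs works out to under two percent. A quick sketch using the scores quoted above:

```python
aga_score = 10942      # Alienware 15 + AGA with the GTX 970
desktop_score = 11140  # desktop 4790K (underclocked) with the same GTX 970

# Relative penalty of the AGA run versus the desktop run
penalty = (desktop_score - aga_score) / desktop_score * 100
print(f"AGA penalty vs. desktop: {penalty:.1f}%")  # about 1.8%
```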
Received my AGA and EVGA GTX 980Ti today.
After 4 hours trying myself and with Alienware Tech Support, I have given up trying to get this thing working.
Windows 10. Ran DDU before install, installed 353.62 initially, and got the 'exclamation point' in Device Manager no matter what. Ran DDU again, installed 353.30, and the 980Ti was recognized, but Windows refused to use it.
Everything I ran used only the iGPU. Yes, I selected (high performance) in the NVidia control panel.
I was escalated to an 'engineer' at Alienware and was told there is a 'hotfix' driver coming tomorrow for the 980Ti. I have a call-back scheduled at 5pm EDT.
Any suggestions in the meantime to get the AGA working? -
reptileexperts Notebook Consultant
Put a different card in, lol. There are known issues around 980Ti compatibility. If they have a hotfix for the drivers coming tomorrow, all you can do is wait. Some users on here have been able to get the 980Ti to work, but I believe it was on 8.1, and I also believe it required a lot of disconnecting and reconnecting.
Sent from my iPhone using Tapatalk -
Not terribly helpful.
I don't have another $650 video card to swap in for this $650 card.
At this level of investment in the Alienware brand (laptop + AGA), you would expect them to get it right. -
reptileexperts Notebook Consultant
No more help can be given, I'm afraid. Use Google's site search directed at this forum and you should be able to find the page where someone described how they managed to get the 980Ti going. But as far as official support goes, it's non-existent at this time, even if you bought the 980Ti directly from Dell with the Amplifier. Fortunately, the hotfix is coming.
AGA + 980ti
The issue is ongoing. The 980Ti works fine with Win 8.1 if you install drivers just for the 980Ti.
With Windows 10, the 980Ti still does not work. Someone found a temporary fix: wipe everything with DDU, disable Windows Update, then download and install the older Win 10 driver, 352.30. Unfortunately that fixes the issue for approximately 15 minutes, then Windows still brings you back to the integrated card.
So as of now there is no fix; Dell escalated the issue to NVIDIA, and everyone is patiently waiting. Welcome to the club.
Hey there... I'm new on the forum although I've been reading this thread for a while.
I got my AGA with an EVGA 980ti SC+ today and I already got it running flawlessly. I don't know why, but the 980ti was recognized with the 353.30 without problems.
I'm running an AW15 on Windows 10 with an external display and holy sh.t this thing is on fire now!!!
For the installation I did following steps:
1. Update the Graphics Amplifier software if you're using Windows 10.
2. Plug a monitor into the AGA.
3. Uninstall whatever drivers you currently have with DDU.
4. Connect the AGA and restart.
5. Install the MOBILE driver for the 980M (353.30).
6. Restart.
So right now I'm running Witcher 3 at 1960x1200, all maxed out, with a stable 80-90FPS (locked to 60 and synced while playing).
http://www.3dmark.com/3dm11/10169490
3DMark Score: 16621
Graphics Score: 25187
Physics Score: 8218
Combined Score: 8241
I'm pretty satisfied -
That's not true. I posted the fix and it's been working fine for about 2 weeks now. Here's what I did.
1. uninstall driver using DDU
2. install 353.30 windows 10 driver
3. use this tool to prevent the NVIDIA update: https://support.microsoft.com/en-us/kb/3073930?utm_source=twitter
4. disable automatic driver updates in Windows 10 with these instructions: open the Control Panel by right-clicking the Start button and selecting Control Panel. Navigate to System and Security > System > Advanced system settings. Click the Hardware tab, click Device Installation Settings, select the "No, let me choose what to do" option, then select "Never install driver software from Windows Update."
You have to use the tool I linked in order to prevent the NVIDIA update. Choose "hide updates" after the download, then select the NVIDIA update. Like I said, it's been working flawlessly for me since. I'm running an Alienware 15 w/ 970M and a 980Ti in the AGA.
Got mine working today with the new 355.60 (8/13/15) driver.
Alienware still has no official fix, so YMMV. I have mine working with an external monitor.
Firestrike w/980Ti: 11725
Firestrike w/980m: 8405
http://www.3dmark.com/fs/5705634 -
Hey guys, I read the first 21 pages of this thread and realized it's taking too long to read everything, lol.
I have an Alienware 17R2 on the way (currently in production): 4980HQ CPU, 980M, 16GB RAM, 512GB M.2 boot + 1TB 7,200RPM storage, touch screen, Win 8.1 Pro, and the Amplifier with a GTX 980. I just ordered the 240W power supply from Dell (in case I need it).
My Questions:
For those of you with specs similar to mine, how are things working using the latest drivers and BIOS version? Does everything work well, with the Amplifier performing as expected?
Have any of you upgraded to Windows 10 yet? If so, have you had any issues since the upgrade? Do you recommend upgrading, or holding off for a while?
I also have some spare Corsair Vengeance RAM, and I'm wondering if it would work so I can swap out what the laptop comes with.
I am strongly considering buying an EVGA Titan X Hybrid to use inside the Amplifier so I can disconnect the loud fan many of you complain about and still maintain very cool temps on overclocks. Would installing the Titan X Hybrid be an issue, or would it be a fairly straightforward installation?
If anyone has any suggestions for a better GPU to use inside the AMP-please share your insight.
Thanks a lot in advance. Any tips are greatly appreciated.
Last edited: Aug 13, 2015
I have an AW17 R2 with almost exactly your specs. I am running Windows 10, and I replaced the M.2 and HDD with 850 EVOs.
I have the Amplifier with an EVGA 980Ti. It doesn't work out of the box. You have a chance of getting it to work with the latest 355.60 driver, but it seems to be hit or miss. It seems to help if you use an external monitor.
Alienware knows it doesn't work and says a driver is 'forthcoming', but there is nothing available for now.
BTW - you only have 2 slots for RAM on the motherboard, so that RAM won't get you anything (since it's 2 kits of 2x8).
A05 BIOS is fine if you don't plan on overclocking. It may solve some problems with the Graphics Amplifier.
Windows 10 upgrade has two issues: the Killer wireless driver is flaky (the Dell supplied one) and the Dell supplied driver for the Sound Blaster Recon3D disables the jacks on the side. There are updates available for both from vendor websites.
Good luck. -
I know there are issues with the 980Ti (unfortunately) but I plan to run a Titan X which as far as I know works just fine.
Would I have better overall stability if I don't upgrade to Win 10, or am I ok to upgrade?
Also, would the Titan X Hybrid fit inside the Amp without any difficult installation/mods? How are those of you running Titan Xs in the Amp finding the performance overall? Is there plenty of power in the Amp to achieve reasonably decent overclocks?
Yes, your 2133 RAM is slightly better than the OEM. It seemed from your picture you wanted to upgrade to 32GB of RAM.
Titan should be fine. There isn't really a place to mount the Hybrid's fan in the case unless you can replace the front fan with the Hybrid. The AGA fan is an 80mm.
Win 10 has been pleasantly stable after getting the drivers up to current levels. The only one causing BSODs was the Killer driver. The new one is stable for me.
The power of the Amp is fine for overclocking. In Witcher 3 my Ti boosts to 1460MHz constantly.
RAID 0 is not currently possible, at least. But that should not affect framerates anyway.
http://www.3dmark.com/fs/5699642
I've got the fans at 80% constantly, and the room is currently at approx. 40 degrees... The highest temperature of the Ti was 68-69 degrees.
The Fire Strike score mentioned above wasn't with my final overclock, so I'm pretty confident some more points should be possible.
It is, and the 4980HQ should be capable of getting even higher overall scores.
P.S. Does the 17R2 have mSATA?
I'm on an AW15 with a 4710HQ. I overclocked it with XTU to 3.7GHz, but when I'm running benchmarks I'm hitting power-limit throttling... Maybe a lower OC that runs constantly without throttling would be even better.
But the CPU has never throttled while gaming (3.6GHz constantly).
reptileexperts Notebook Consultant
On your RAM question: your motherboard can only run RAM at 1600MHz. While the timings on your 2133 kit are likely faster, when it is downclocked to 1600 the timings will change as well and likely end up matching what is currently in your system. Sell the RAM and recover some funds from it.
Glad to see the 980Ti going through the motions. My AGA is now simply used for flashing cards and random benchmarking, as I decided to do a full desktop build to go with my 17R2.
Cheers
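The timing trade-off described above can be sketched numerically. First-word latency in nanoseconds works out to CL x 2000 / (MT/s); the CL values below are hypothetical examples for illustration, not the specs of anyone's actual kit:

```python
def latency_ns(cas: int, mts: int) -> float:
    """First-word latency in ns: CAS cycles divided by the memory clock (MT/s / 2)."""
    return cas * 2000.0 / mts

# Hypothetical kits (CL values are assumptions):
print(f"DDR3-2133 CL11 at rated speed: {latency_ns(11, 2133):.2f} ns")  # ~10.31
print(f"DDR3-1600 CL9 at rated speed:  {latency_ns(9, 1600):.2f} ns")   # 11.25
# Downclock the 2133 kit to 1600: with a typical relaxed CL of 9, its
# absolute latency lands exactly where the stock 1600 kit already is.
print(f"2133 kit run at 1600 CL9:      {latency_ns(9, 1600):.2f} ns")   # 11.25
```

Which is the point being made: once the faster kit is forced down to 1600, its effective latency is likely indistinguishable from the RAM already installed.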
kakashisensei Notebook Consultant
Anyone know if the new 355.60 NVIDIA driver finally allows swapping between the 980Ti and the 970M or 980M without having to uninstall/reinstall drivers?
Does the AGA need to be connected at all times to my Alienware 15, or is it more or less a plug-and-play setup?
1. Will I need to install and uninstall drivers every time I disconnect it and bring the laptop with me?
2. Can my wireless mouse still be connected to the AW15 or does it need to be connected to the AGA?
Thanks
Last edited: Aug 14, 2015
Have had the chance now to do some poking around with the card and AGA.
Unigine Heaven, 1600x900, 8xAA, windowed: 2626 (stock clocks)
With +50 mem and +10 core: 2784
Unigine Heaven, 2560x1440, 8xAA, full screen: 1570 (stock clocks)
With +50 mem and +10 core: 1655
Firestrike (OC): 12331
Firestrike (stock): 11725
The temperature limit was set to 80C for both runs, and it hit 74C stock and 77C OC'd, so it looks like there is a lot more headroom when I decide to push.
BTW - still no resolution for the connected/disconnected issue. If you ever pull the AGA and then reboot the laptop, you have to re-install the drivers to get the 980M to work.
Last edited: Aug 14, 2015
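For what it's worth, the scaling from that small +50/+10 overclock is fairly consistent across the three runs above. A quick sketch over the posted scores:

```python
# (stock, overclocked) score pairs quoted in the post above
runs = {
    "Heaven 1600x900":  (2626, 2784),
    "Heaven 2560x1440": (1570, 1655),
    "Fire Strike":      (11725, 12331),
}
for name, (stock, oc) in runs.items():
    gain = (oc - stock) / stock * 100
    print(f"{name}: +{gain:.1f}%")  # roughly 5-6% in each case
```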
Do 3rd party cards with aftermarket coolers fit inside the AGA fine?
reptileexperts Notebook Consultant
The ASUS Strix fits just fine with its large fans and cooler. Can't attest to cards with three fans and such. Most prefer reference cards, however.
It depends on the size of the cooling system. Dual-slot cards should fit, but they almost always have different dimensions... The EVGA does fit, I can confirm that, but I'm pretty sure the Zotac AMP Extreme 980Ti won't fit, as it is a 3-slot design.
-
I'm looking at 980Tis.
So I should stick to dual-fan types? EVGA isn't available in my country, and the only dual-fan card I can find is the MSI Gaming G.
The rest are triple-fan (Gigabyte, Inno3D, Galax, Strix).
Thoughts?
The Strix should be dual-slot, but I'm not quite sure. I was thinking about trying the Strix as well, but I chose the EVGA because of its better availability. Reptileexperts mentioned a few posts back that it fits, but I'm not sure if he was talking about the 980Ti.
Hey guys, my Alienware 13 is on its way and I'd like to know if getting the GA later on is a viable way of extending the lifespan of the laptop. Thing is, I need it to last 6 years, and ideally I'd use it as a desktop replacement at home and a regular laptop at uni. So, on paper, the GA is exactly what I'd need...
BUT! Reviews all said the ULV i7 CPU is a bottleneck severely limiting the GA? Now, I'm not a heavy gamer, I'm used to ultra-low settings and choppy framerates, but I'd really like to game on the A13 till about 2020 using medium settings. It does cost almost 4 times as much as my current machine...
If the CPU is already a problem today it probably will be a big problem later on. Since the GTX 960 performance already exceeds my expectations, I'll be fine without the GA for a while, but imo the only reason for an external GPU is actually so that the GPU can be upgraded later on. But is there a point in upgrading the GPU if the CPU can't keep up?
So what's the verdict? Should I forget about the GA and gaming on a supposedly gamer laptop? Or are the GA's detractors simply jealous, and finally I'll have a laptop that won't grow obsolete in a blink of an eye?
PS: Bonus question for brownie points: My current laptop and the standard I'm used to in a gamer laptop is an ASUS K52JT bought in 2010. How much of a performance bump can I expect? In layman's terms, will the A13 blow my mind? -
There is a bottleneck when you use the ULV i7 - that's something Skylake ULVs should fix (soon). If you want it to stay relevant until 2020, you might consider waiting for the Alienware 13 R2, or getting an Alienware 15 for future-proofing. That said, the bottleneck is relative: for a hardcore gamer it might matter because he likes max settings at 60fps on his shiny new 4K G-Sync monitor, and almost certainly dropped several hundred dollars on a dedicated graphics card too; so it's up to you to decide how "safe" is safe.
As for the i7 5500U CPU, its weakness is having only two cores, and most games will be optimized for more cores going forward (some even have checks blocking dual-core CPUs, like Dragon Age: Inquisition). However, thanks to the Alienware's chassis, there are none of the thermal throttling issues that hinder non-gaming laptops, so you get 3.0GHz on both cores (assuming you don't get a lemon). Performance should be quite good for gaming.
Relative to the ASUS K52JT:
You're going from a gen-1 Nehalem CPU to a gen-5 Broadwell CPU. Although the i7 5500u is a ULV (which means trading off performance for more battery life), even the gen-1 i7 model should lose to it (and it does, though not by much). Imho, you shouldn't have to worry about it, as CPU improvement won't be as drastic 5 years from now (as opposed to 5 years prior) since the scaling from Moore's law is moot.
The HD 6370m (Class 4 GPU) sits somewhere between gen-3 Ivy Bridge and gen-4 iGPU performance so... it's the main culprit behind needing to live with "ultra-low settings and choppy framerates", whereas the GTX960m (Class 1 GPU) is capable of running all modern games at 1080p, as long as it's not taxed too heavily. It's always beneficial to have a faster GPU, since more things are being offloaded to it all the time - it wasn't too long ago that all GPUs did was draw on the screen, now they can compute OpenCL, handle PhysX and even perform raycasting! Interesting times.
There's only one real competitor to the GA right now, and that's MSI's GS30 Shadow dock. It just feels... flimsy, and the docking approach is idiotic. With the GA, you don't need a new monitor, keyboard and mouse just to boost performance. That's pretty much what made the decision for me, especially when you consider that PCIe x4 vs x16 isn't all that big a deal (few fps).
In layman's terms... yes, just try it! (And if you don't like it, you have 21 or so days to return it.)
Last edited: Aug 15, 2015
Thank you SO MUCH! Finally someone speaks plainly about the limitations and translates the meaningless jargon of benchmarks and stuff! Concise, clear, exhaustive reply, I couldn't have hoped for more!
The bottleneck issue suddenly doesn't seem too bad. One question though: suppose I make use of the GA starting 2018-ish. Will it trash the CPU entirely due to the bottleneck, running it near 100% for too long and just exhausting it over time? Or does the bottleneck only mean that less of the desktop GPU's potential is used, say 70% of what it's capable of? That would be the ideal case, because I have never wanted to play anything at max settings; medium is the highest I ever aspired to (raised on budget PCs, it's sort of ingrained now), and you can do medium easily with 70% of a decent graphics card.
Sadly I can't afford to wait, I NEED to have a laptop by mid-September at the latest. My purchase is pretty last-minute as it is.
I looked at the MSI ShadowDock, but I never considered it seriously: the price of the A13 is already stretching my budget to its limits. (As far as laptops, and especially gaming ones, go, I've found it best to buy the most expensive option that's still affordable; the more expensive it is, the longer it lasts.) I didn't even consider that on top of the dock itself I'd need to splurge on so many extras!
Thanks again, and have a great day! -
No problems! I've been through it before, so I know what it's like wading through data without gaining much insight.
As for your question, it's hard to predict the future. What I can say is that the CPU is designed to run at 100%, all the time (never exhausted!), so your latter guess is correct: the bottleneck only means less usage of the desktop GPU. However, DirectX 12 should improve things further, since it reduces the time the CPU spends waiting around (more on that at the original MSDN page). While the difference wouldn't be as large as on a quad core, a dual core would still benefit from shaving time off the second CPU core.
Looking into my crystal ball, I would say even high settings shouldn't be a problem. Recall that Moore's law is dead; Intel has been stuck below the 5GHz barrier for ages (in fact, the promise to surpass it was quietly shelved, and instead you got the whole concept of dual and quad cores). Right now, it's less about raw performance gains and more about improving efficiency and power savings. That's mainly what you're missing out on by not waiting.
Although, considering that the i7 5500U isn't exactly a beast, you should aim for the sweet spot instead, somewhere between GTX950 and GTX960 (or their Ti variants). This way, it wouldn't hurt your wallet, and you can keep upgrading affordably into the future. On the flip side, if you bought a flagship 980Ti now, it should last longer into the future, but you would be stuck with its current features (so no HBM GPU for you, unless you chip in a few hundred more $$$).
If anything, the GS30 Shadow competes more with the 17R2, since they are top-tier CPU-wise (however, GS30 has problems when stressing its CPU). -
Is there a longer cord available anywhere? The PCI cord, that is.
I spent some time on the weekend pushing the AW15 with the 980 ti:
Firestrike: 14540 Overall
3DMark11 Performance: 17144 Overall
It's funny how it works even though it's not supported... I can use the 980Ti only with external displays. When no external display is connected, although the GA is, the 980Ti is still recognized but not activated, so all games run on the iGPU. When I connect the monitor, it works again as it should...
@JDawggS316: Sorry, I don't know about longer cords. Try asking Dell Support... but I wouldn't bet that a longer cord is available.
So no one here is running a Hybrid GPU in their Amplifier?
So I got my Alienware 17R2 and loaded Win 10 on it. Now waiting for the Amplifier and the Zotac Titan X I ordered to arrive. Benchmarks coming soon.
For those of you on the 4980HQ CPU, how is it keeping up with the GA?
Curious to see some of your Titan X overclocking profiles, I'd like to have a general overclocking point of reference to work with. Firestrike scores with OC profiles would be greatly appreciated....
Do I need a Beta driver to run the Titan X in the GA, or is Nvidia's latest driver good enough?
Thanks
Last edited: Aug 20, 2015
When idle the CPU drops to 800MHz unless I change the power option to High Performance; then it keeps running at 3.5GHz without the GA and 4GHz with the GA (as connecting the GA automatically enables the OC).
P.S. There is a bug in the A05 BIOS: by default the overclocking feature is set to on with the GA but the OC itself is off, which actually disables Turbo Boost! You either need to set the OC to level 1 or disable the overclocking feature entirely in the BIOS (that's if you are not using a tool like XTU).
I would advise against getting the Alienware 13, though. The ULV processor, along with the PCIe 2.0 x4 connection to the GA, makes it a bottleneck for any high-performance desktop GPU.
*OFFICIAL* Alienware "Graphics Amplifier" Owner's Lounge and Benchmark Thread (All 13, 15 and 17)
Discussion in '2015+ Alienware 13 / 15 / 17' started by Mr. Fox, Dec 10, 2014.