They are good for entertainment centers and as a TV setup. Also, for traveling, the X51 fits in larger backpacks... I do assume you were talking about the X51.
-
Just picked up a used R9 290 on CL for 165 (reference design Asus). Plopped it in the amp and got 9172... not too shabby.
http://www.3dmark.com/3dm/9150093
I am going to buy an AW 15 R2, full specs. I was wondering if the GA will be compatible with the Oculus Rift, disabling Optimus and connecting it directly to the dGPU. Thank you.
-
I think someone from AWA may have successfully used the Rift with the laptop + GA.
-
2. That all depends on which desktop GPU you get.
-
About the Oculus: I know it is just a video output (plus motion sensors), and it has a 2160x1200 resolution at 90Hz.
-
I have officially been ruined by high-Hz panels. I went back to using my 17 R2 today and everything feels so choppy when I move it. Damn you, ROG Swift and GT72 with 75Hz.
-
I just bought an Alienware 17 R3 with a Graphics Amplifier. I have a GTX 980 Ti in it at the moment and everything seems to work fine. The only thing I want to do is reduce the noise, so a future fan mod is in the works.
My question is regarding powering up the laptop with the amplifier installed. Is it normal, when you switch on, for the laptop and amplifier to come on for a few seconds, then switch off and then on again?
Cheers
For the R3 it seems normal, AFAIK. My AW15 R1 started "normally" without going on and off twice; only if I unhooked the amp, used the laptop mobile, and then hooked it back on did it do a short power on/off/on.
-
I just ordered a Noctua NF-A9 FLX fan, so hopefully that improves the crazy amount of noise the amplifier puts out lol.
So I put that R9 290 in the amplifier and noticed some issues with heat. My games would run fine for the first 15-25 minutes, but after that the frame rates would drop drastically. I'm talking about going from 100+ FPS in Battlefield Hardline all the way down to 30 or so. The same thing happened with League of Legends, where I was getting around 400 FPS uncapped and it dropped to 30 FPS in some battles mid-to-late game. Is this just due to heat from the card? I know the R9 290 is a hot card, but is that normal? I have not tried it in a desktop, as I don't have one to plop it into, but could it be something else? The card is clean and ran a 9172 Firestrike score basically back to back. This is my first higher-end AMD card; I usually stick to Nvidia, and now I remember why.
-
Are you running MSI Afterburner or similar and watching GPU temps? It should be easy to see whether the card is getting hot or not. My 980 Ti runs cool even overclocked and under heavy usage in the GA box, but my 980 Ti is a blower design, which pumps the heat out of the case rather than just dumping it into the case. I believe I could easily run my GA without the fan in the front if I wanted to. I'll bet it's something besides heat throttling the card, but you'll have to log/watch GPU temps to rule that out first.
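If you export an Afterburner hardware log, a quick script can tell you whether the FPS collapse lines up with the card sitting at its thermal limit. This is only a sketch: the function name is made up, the 94 °C limit is an assumption (the reference R9 290 runs very close to its mid-90s °C target before throttling), and you would feed in the temperature and framerate columns from your own log.

```python
# Hypothetical helper: paired per-second samples of GPU temperature (deg C)
# and FPS, e.g. exported from an MSI Afterburner hardware log.
# Assumption: 94 deg C as the throttle point for a reference R9 290.

def looks_like_thermal_throttle(temps, fps, temp_limit=94, fps_drop=0.5):
    """Return True if FPS fell below `fps_drop` of its early-run average
    while the GPU was at or above `temp_limit`."""
    warmup = fps[: len(fps) // 4] or fps      # first quarter = "cool" baseline
    baseline = sum(warmup) / len(warmup)
    for t, f in zip(temps, fps):
        if t >= temp_limit and f < baseline * fps_drop:
            return True
    return False

# Synthetic example: card heats up to 95 C and FPS collapses -> True
print(looks_like_thermal_throttle([70] * 10 + [95] * 10,
                                  [100] * 10 + [30] * 10))
```

If this returns False on your log while the FPS still tanks, the culprit is more likely VRM temps, the driver, or a power limit than the core temperature.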
-
I actually had the GA on a small side table beside me and it got uncomfortably hot with the air blowing on me.
Did we ever get the BIOS update to allow PCIE 3.0 on the new skylake models with the AGA?
I intend to buy an amplifier soon but haven't seen a BIOS update. If it's still PCIe 2.0 x4, I'm going to be rather put out.
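For a rough sense of what's at stake, here is the back-of-envelope bandwidth math for an x4 link under each generation. This is pure arithmetic from the PCIe signalling rates and encoding overheads, not a measurement of the AGA itself:

```python
# Approximate usable one-way bandwidth of a PCIe link.
# PCIe 2.0: 5 GT/s per lane, 8b/10b encoding   -> 500 MB/s usable per lane.
# PCIe 3.0: 8 GT/s per lane, 128b/130b encoding -> ~985 MB/s usable per lane.

def pcie_bandwidth_mb_s(gen, lanes):
    """Theoretical usable bandwidth in MB/s (ignores protocol overhead)."""
    per_lane = {2: 5000 * 8 / 10 / 8,      # 500 MB/s
                3: 8000 * 128 / 130 / 8}   # ~984.6 MB/s
    return per_lane[gen] * lanes

print(pcie_bandwidth_mb_s(2, 4))  # 2000.0 MB/s at Gen2 x4
print(pcie_bandwidth_mb_s(3, 4))  # ~3938.5 MB/s at Gen3 x4
```

So Gen3 x4 roughly doubles the link, which is why people in this thread care about the BIOS update; how much that translates into FPS depends on the game.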
-
http://www.dell.com/support/home/ca/en/cadhs1/Drivers/DriversDetails?driverId=CHDTX
Hopefully I can download it, update, then remove it.
*EDIT* Frank Azor said that AWCC isn't the only thing that's needed for Gen3! They will bring a BIOS update and a new (AMP??) driver that are necessary for it to work. ETA is next week if everything goes to plan.
I know G-Sync is unavailable on my 17 R3 because Optimus requires the display to be connected directly to the dGPU. My question is: does the graphics amplifier connect in such a way that the UHD IGZO panel can run G-Sync, or would it only work with a G-Sync display connected to the DisplayPort on the amplifier?
-
So I'm finally looking to order my AGA and a card (and a better monitor)
I intend to run at 4K, but I'm not sure I can splash out for a 980 Ti currently. I will likely have to get a 390X or a 980 for now and upgrade later. (Or I could get a 980 Ti with the AGA but keep running at 1080p until I can get a monitor, but that seems like the worse scenario.)
Does anyone have any specific insight regarding the AGA and which card to choose? I know AMD cards were preferred due to driver issues, but I believe that problem has been solved?
I think I'd prefer a 980, as they generally overclock better, but any advice will be appreciated.
My advice (I have no idea what I'm talking about) is to get the AGA now--they are widely available for $200--and the best GFX card you know will work. Don't buy a 4k monitor now because the prices are falling so fast it's tough to catch them. Keep your current monitor (I'm watching the Seahawks game on an HP 2311 (HD) I paid $130 for five years ago). Right now an Asus reference 4k monitor costs $1200 and everybody's cooing about how CHEAP they are--1/4 of last year's price! The Asus consumer 4k is $450 and by the summer I'll bet reference 4k monitors are close to that price.
Wild cards: Everybody knows the nVidia 1000x series is coming soon. Do you want the VERY best or the best now? Wild card #2: A consumer 4k has a lot to offer if you're not doing VFX in your living room. Like I said, I'm still not done with the cheap HP, but maybe I'll add in a cheap 4k and go with a 3-monitor setup? Only my credit card company knows for sure!
Prices in my country for tech are super high, but it means a fantastic used market. Usually I sell GPUs for around 90 percent what I paid.
As for the monitor, I was planning to get a new xb281hk, which is roughly 3 times the cost of my 1080p monitor (doesn't seem so bad for 4 times the resolution plus g sync)
I upgrade A LOT, so doing so doesn't bother me. I'm more concerned with getting the best out of what I can currently afford.
It seems that a 980 Ti for 1080p would be overkill. I am leaning toward getting the 4K monitor and a 390X, since I can sell the 390X on just after New Year for most of what I paid and then upgrade to a 980 Ti.
I have an external HDD plugged into the AGA and only got around 38 MB/s transfer speed, while testing that HDD on the A13's USB port directly gave 120 MB/s. I have all the drivers updated; does anyone have the same issue?
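For what it's worth, 38 MB/s is right in the real-world range for a USB 2.0 bulk transfer, while 120 MB/s is typical of a spinning HDD on a USB 3.0 port, so one guess (only a guess) is that the AGA's hub is negotiating USB 2.0 for that port. The raw-rate arithmetic:

```python
# Raw USB signalling rates converted to MB/s, ignoring protocol overhead.
# USB 2.0 high speed: 480 Mbit/s raw; real-world bulk transfers usually land
# around 35-40 MB/s after 8b/10b-era framing and protocol overhead.
# USB 3.0: 5000 Mbit/s raw; a spinning HDD (~100-150 MB/s) becomes the limit.

def usb_theoretical_mb_s(raw_mbit):
    """Raw signalling rate in Mbit/s -> theoretical MB/s ceiling."""
    return raw_mbit / 8

print(usb_theoretical_mb_s(480))   # 60.0  -> observed ~38 MB/s fits USB 2.0
print(usb_theoretical_mb_s(5000))  # 625.0 -> drive-limited 120 MB/s fits USB 3.0
```

If that guess is right, the drivers won't change it; it would be a property of the hub inside the amplifier.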
-
-
I'm curious to know how it performs at high refresh rates. Ideally, I'd like a 144Hz 1440p monitor, but I was afraid that the combined issues of the laptop CPU and the PCIe bandwidth would create a bottleneck at high framerates.
I would expect that you would see less of a performance impact running 4K 60Hz versus 1440p 144Hz.
Even when I was running a desktop with a 980 Ti, I found that dropping from 4K to 1440p would only take me up by about 20 frames.
It would seem silly to get a 144Hz monitor if I end up only getting 60 FPS, especially if I could get roughly the same framerate at 4K.
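As a sanity check on that hunch: raw pixel throughput is actually slightly higher at 1440p 144Hz than at 4K 60Hz (simple arithmetic below). Raw pixel rate is only a crude proxy, though; at 144Hz the CPU has to prepare more than twice as many frames per second, which is exactly where a laptop CPU behind a x4 link is likely to hurt.

```python
# Pixels pushed per second at each resolution/refresh combination.
def pixels_per_second(w, h, hz):
    return w * h * hz

print(pixels_per_second(3840, 2160, 60))   # 497,664,000 (4K @ 60 Hz)
print(pixels_per_second(2560, 1440, 144))  # 530,841,600 (1440p @ 144 Hz)
```

So the GPU-side load is comparable; the difference between the two targets is mostly per-frame CPU and bus overhead, not fill rate.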
-
No need to go crazy with benchmarks. Just let me know your clocks (if OC'd) and roughly how the frames are in a couple of games I can compare to benchmarks.
Also, what CPU are you running?
EDIT: If the AGA really can push 144Hz at 1440p, I'm really heavily leaning toward a ROG Swift again. I owned one previously and loved it.
The advantage there is I can put off getting an external GPU for a little while longer, because a 970M should be able to push 1440p at 50+ FPS on med-high settings, which will be perfectly playable.
Whereas if I get a 4K panel, I'll either be dealing with non-native res or horrible framerates.
Here is the output from Unigine Heaven:

Unigine Heaven Benchmark 4.0
FPS: 107.5
Score: 2708
Min FPS: 26.6
Max FPS: 217.5

System
Platform: Windows NT 6.2 (build 9200) 64bit
CPU model: Intel(R) Core(TM) i7-4980HQ CPU @ 2.80GHz (2798MHz) x4
GPU model: NVIDIA GeForce GTX 980 Ti 10.18.13.5891 / Intel(R) Iris(TM) Pro Graphics 5200 10.18.15.4248 (4095MB) x1

Settings
Renderer: Direct3D11
Mode: 1600x900 8xAA windowed
Preset: Extreme
60fps at Ultra/Max (2560x1440) in GTA V, Project Cars, MGSV, Fallout 4, Shadow of Mordor, World of Warships.
I'd expect to need to drop a notch or two at 4K, or run SLI, to stay at Max. I'm interested to see how it handles Ashes of the Singularity. I have a relatively weak CPU.
http://www.dell.com/support/home/us...alienware-13-r2&languageCode=EN&categoryId=AP
For those of you with an AW13 R2, this should activate PCIe 3.0 for the GA. And a new BIOS update for the 15 R2 and 17 R3 was just released, containing notes about the GA.
-
This is a CPU limitation.
Awesome. I've just ordered an AGA on eBay (but no actual GPU for it yet).
-
I can confirm that on the AW17 R2, the GA now works with Gen3 on NVIDIA cards.
-
Should be around 7-10%, but I didn't test it.
-
So I've finally settled on a 390X for my AGA; both are on the way.
I went with a Sapphire 390X Tri-X. Has anyone used one? I'm concerned it might be a little long.
Guys, about the 6820HK CPU in the AW 17: can I use it for 4 to 5 years with the amp and a future high-end graphics card without a bottleneck?
-
Can anyone actually measure the available space for the GPU inside the AGA?
The quoted max length inside turns out to be an inch less than the new card I ordered. I know they are often conservative with these numbers, so before I waste the time and money to return my card, I'd like to know if it might fit. Once it's opened, I can't return it.
I have the AGA with an R9 390, but it isn't recognized when I run a 3DMark test; it comes back as a generic VGA video card. A small thing, but I am curious to see if it is normal or if I have done something wrong. It is labeled correctly in Device Manager, and the 3DMark system scanner is up to date. Is this just a product of the AGA being used?
Any insight will be helpful, thanks.
Generic VGA also isn't new (I got it on both Nvidia and AMD at one point), but it doesn't affect the overall performance of the GPU.
As time progresses I am thinking more and more that my 17r2 was a fantastic decision. I can upgrade it as long as new desktop cards are being produced.
-
Is it possible for Alienware to release another edition of the amplifier that will run at x16 speed and work with the current proprietary socket they designed? Or does the proprietary socket limit it to x4? Or is a future Thunderbolt eGPU solution possible? I just purchased an Alienware 15 with a 980M 4GB, an i7-6820HK Skylake processor, and 16 GB of RAM. Will I be able to game at 1080p on mid-high settings for at least 2-3 years?
*OFFICIAL* Alienware "Graphics Amplifier" Owner's Lounge and Benchmark Thread (All 13, 15 and 17)
Discussion in '2015+ Alienware 13 / 15 / 17' started by Mr. Fox, Dec 10, 2014.