Although the case is already too narrow the way it's laid out, this sounds like a great mod.
I remember reading somewhere that the actual wattage needed by a 980 Ti is much lower than its TDP; I believe it was around 175 W draw from the power supply at full utilization, but I can't find it anywhere. What dimensions were you aiming for? You should keep us posted on progress.
My starting thought would be to put the case by itself on a power meter and check power consumption with the stock power supply etc.; that would give you an idea of the current needed. The perfect final product would use an external power supply, like the 18's 330 W brick for example, meaning the enclosure could be roughly double the size of a 3.5" HDD case.
-
http://www.tomshardware.com/reviews/nvidia-geforce-gtx-980-ti,4164-7.html
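For a rough sanity check on the external-brick idea, here's a back-of-envelope headroom calculation in Python. All figures are assumptions: the ~175 W draw is the claim from the post above, and the 30 W board overhead is just a guess for the PCIe board, fans, and USB.

```python
# Rough PSU headroom check (assumed figures, not measurements)
psu_w = 330            # Alienware 18-style external brick
gpu_draw_w = 175       # claimed 980 Ti full-load draw from the post
board_overhead_w = 30  # guess: PCIe board, fans, USB, margin
headroom_w = psu_w - gpu_draw_w - board_overhead_w
print(f"Headroom: {headroom_w} W")  # Headroom: 125 W
```

Even on those rough numbers there's comfortable margin, though a card that spikes closer to its 250 W TDP would eat most of it.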
Another possible way to fit a power supply would be to use a 1U, SFX, TFX, etc. unit with the GPU stacked on top; that way a higher-wattage PSU can be used while keeping the width about the same as the card, making for better portability.
Also, the length of the case could be reduced to 7", since new graphics cards using HBM are all likely going to be half-length, like the new Fury series.
Last edited: Sep 26, 2015 -
-
At this point, I think it might be worthwhile to just hold out for a few months, since there will likely be some Thunderbolt 3 eGPUs coming out which may have thinner profiles. There is also a rumor of OCuLink making its debut in the fall of 2015, which Alienware is rumored to be using with the Graphics Amplifier, though they probably had access to an earlier spec of OCuLink that only supports x4.
Oculink
http://www.kitguru.net/components/a...alize-oculink-external-pci-express-this-fall/ -
So I ended up spending a little extra and getting the 15 R2 with Skylake, as with an i7-6700 and a 970M (and DDR4 RAM, which I didn't realize it came with) it's actually only 200 bucks more than the similarly specced R1 I was intending to buy (despite that one being advertised as 500 dollars off).
I sold my desktop to pay for this though, and it looks like it will take 2 weeks to arrive. Need to find something else to keep me busy for a while.
Really curious to know if Skylake improves GA performance though, as it apparently turbos more consistently due to better thermals. Sounds as though a bit of an overclock might be enough to get very close to desktop-level performance.
I won't be getting a GA until next payday due to other commitments (tattoos are expensive...), so I hope the extra cost will have been worth it. -
My current 17 R2 has failed even after a mobo replacement etc., so I am pushing to replace it with a newer model or get a refund. Since I only do very light gaming when traveling (League of Legends), I'm now wondering: should I just get the latest-model Alienware 13 and use my current GA with the GTX 980?
Do you guys think there will be a big drop in performance? Are the newer models better?
I guess my question is, which option?
1. Skylake 17 with GA 980
2. Skylake 13 with GA 980 (and save the difference for an additional SSD or something)
3. XPS 13 (will this run LoL?) and an mATX desktop to put my 980 in (around the same budget, with a 6700K etc.)
Also, the 15 has a graphics option of the M395X, which bumps the CPU to an i7-6820HK, so that might solve the CPU bottleneck and the driver issues I was having with the GA.
Last edited: Sep 28, 2015 -
-
Here's a question I don't see addressed anywhere:
The GA seems to use the same PCIe connection for the GPU and USB devices. Given the GPU is likely saturating most of the bandwidth, does having USB devices hooked up to the GA impact performance?
If you have a USB sound card, a USB 3 HDD, and a mouse and keyboard hooked up to it, I imagine that could be a lot of data. If this impacts GPU performance, then a separate USB hub may be the way to go. -
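As a rough sanity check on that worry, here's a back-of-envelope calculation assuming the GA link is PCIe 3.0 x4 and assuming typical per-device USB throughput (all figures are rough guesses, not measurements):

```python
# Back-of-envelope: how much of an assumed PCIe 3.0 x4 link
# could USB traffic plausibly eat? (all figures are rough guesses)
pcie_x4_gbs = 4 * 8.0 * (128 / 130) / 8  # ~3.94 GB/s per direction
usb_load_mbs = {
    "usb3_hdd": 180,      # typical spinning drive, MB/s
    "usb_soundcard": 1,   # audio streams are tiny
    "mouse_and_kb": 0.1,  # negligible
}
usb_total_gbs = sum(usb_load_mbs.values()) / 1000
print(f"USB share of the link: {usb_total_gbs / pcie_x4_gbs:.1%}")
```

On those assumptions the USB devices claim under 5% of the link's raw bandwidth, so any impact would more likely come from burst timing and contention than from sheer throughput.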
The GA cable feels like it has 2 separate cables in it when I squeeze it.
Gaming doesn't affect the USB 3 hard drive rate, but the GA isn't necessarily fast for USB; I would see it as a glorified hub going to a single USB 3 port. I have seen over 250 MB/s going to USB 3 SSDs etc., but going to multiple 2.5" spinners it can't max them.
Also, the number of devices allowed when the GA is plugged in is limited; I have found that 16 is the max. For example, if I have 16 devices plugged into the GA's 4 USB 3 ports using hubs etc., the ports on the laptop will fail with anything I plug into them.
Another interesting thing is that the Gigabit port on the laptop seems to register as 2 different devices with and without the GA, i.e. Killer #1 and Killer #2 when the GA is plugged in. It seems to have something to do with the rerouting as well. I definitely haven't seen an issue with bandwidth there, as I saturate it almost all the time, but a couple of times a week it will lock up, freezing all network activity and requiring a reboot. I've tried different drivers etc. and disabling Killer bandwidth control; I just assume it's not made for my workload. -
-
So I have a new 17 R3 coming and have a GA that I plan to use. What seems to be the best card to use in these for things to work right without driver issues etc., Nvidia or AMD? I have a 980M coming in the 17 R3 and plan to run a GTX 970 to 980 Ti in the amp to drive my external display.
-
So I would be better off with an AMD card in the GA by that logic then, I suppose? Like a 390X or Fury, I guess. What is known to fit? Thanks
-
Last edited: Sep 29, 2015
-
Thanks, maybe the new Nano would make the most sense. I'm getting older and just like things to work for the most part; I don't want a bunch of driver conflicts.
-
The Nano costs more for less performance, I wouldn't bother. If an R9 Fury will fit, it's a good choice.
I'm planning to get a 970 and later on a 980 Ti, but I'm not sure if I'll regret that choice with the driver issues. -
-
What is it exactly that causes the driver issues? I have wondered how it all works with the GA. In my case I will have the Intel on-die graphics, a mobile driver from Nvidia for the 980M, and then a regular desktop driver package for whatever card, AMD or Nvidia, is in the GA. I could see how this setup could bring issues for sure, if that's how things work. I guess I will figure it out once my stuff is here. I would prefer to go Nvidia in the GA, but not if it's a pain to get it working right with the rest of my gear.
-
Here in NZ, Nvidia cards are cheaper than AMD at almost every price point. At the high end, AMD can't even touch them; the 980 Ti is vastly superior to a Fury X and they cost the same.
Once I get a card and start looking into it, I may be able to do something about the driver issues. I can't imagine it would require much more than some INF trickery to get them working. Not sure if anyone has really looked into it yet? -
You have an MSI 390X in your GA? Does it fit OK? What resolution do you play at, and are you happy with the performance? Thanks
-
-
Driver problems won't be easy to fix.
It's a card detection issue, only fixed by using the old Dell driver and forcing Windows Update to ignore new drivers.
If you don't mind the old driver, and don't mind missing out on the latest game optimisations, then Nvidia is fine, and overclocking will outperform everything.
If you want the latest drivers with no issues and aren't going to overclock, the Fury X is the way to go, as its performance sits between the 980 and the 980 Ti, only a couple of FPS behind the latter when both are at stock clocks.
Another solution is to get the AMD 395M card in the laptop; that way you get a much better CPU and can run anything you want in the GA with any driver. -
The AMD driver doesn't have this issue -
The problem discussed is more relevant on Win10 because of its forced updates. Win7 and Win8.1 don't really have forced updates (although some optional Nvidia drivers from Windows Update can sneak by).
Last edited: Sep 29, 2015 -
Thanks guys, makes more sense to me now.
-
What's the actual workaround for using new drivers for the desktop card? Is it to reinstall drivers any time you disconnect the GA? Not ideal, but doable if the drivers are kept somewhere easy to access.
Starting to question whether this was actually a good purchase now. I had gotten rid of my desktop and laptop thinking this would be able to serve both purposes. -
-
It will probably not be able to do 4K though.
Does anyone have experience pushing high refresh rates on a laptop CPU? I would expect the CPU limitation to be more apparent when pushing 144 Hz.
Worth bothering? -
Refresh rates are not a concern if you go with a G-Sync or FreeSync monitor.
-
Which model 390X is known to fit the GA properly? And to clarify: if running an Nvidia GPU in the laptop and an Nvidia GPU in the GA, you run only the Nvidia mobile driver, and that's it. If running the Nvidia GPU in the laptop and an AMD GPU in the GA, you run the Nvidia mobile driver for the laptop and the AMD desktop drivers separately for the GA, and this setup seems to be less prone to issues?
-
It's better in every circumstance.
http://www.howtogeek.com/228735/g-sync-and-freesync-explained-variable-refresh-rates-for-gaming/ -
My point is that getting a 144 Hz monitor is pointless if the laptop can only push 60 FPS, regardless of whether you have G-Sync or not.
I would rather have 144 Hz than 60 Hz, but if the CPU limits the maximum framerate, I'd be better off sticking to 60 Hz and a higher res (4K), as the GPU is the more important factor for that. -
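To put numbers on the CPU-limit argument, here's a minimal frame-time budget sketch (the 10 ms per-frame CPU cost is a hypothetical figure for illustration, not a benchmark):

```python
def frame_budget_ms(hz: float) -> float:
    """Time available to prepare each frame at a given refresh rate."""
    return 1000.0 / hz

for hz in (60, 144):
    print(f"{hz} Hz -> {frame_budget_ms(hz):.2f} ms per frame")

# If the CPU needs a hypothetical 10 ms of game logic + draw calls
# per frame, it caps out near 100 FPS regardless of the GPU:
cpu_ms_per_frame = 10.0
print(f"CPU-limited cap: {1000.0 / cpu_ms_per_frame:.0f} FPS")
```

At 144 Hz the budget drops to about 6.9 ms per frame versus about 16.7 ms at 60 Hz, which is why the CPU ceiling shows up much sooner at high refresh rates.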
I would get either an R9 290X or an R9 390 over an R9 390X; better price and close enough performance. But if you want an R9 390X, then VisionTek's (usually a safe choice, like PNY for Nvidia), Gigabyte's (maybe this one too), and this PowerColor one should fit. There are more options for the R9 390 and R9 290X (and if you want to go cheaper, the R9 290 is also an option). -
The 390X is only 100 NZD more than the 390. Performance-wise, I think it's the right call. The 390X is already much less than I'd want (I would get a 980 Ti), but with the driver issues I'm forced to AMD, and the 390X seems to be by far the best price/performance card at that level. -
To put it into perspective, prices in NZ (NZD):
980 Ti: 1200
Fury X: 1270
Fury: 1035
390X: 700
390: 600
970: 600
So the 390X is waaay cheaper than a Fury for the performance it offers. -
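Using just the NZD figures listed above (prices from the post, no performance numbers assumed), the premium of each card over the 390X works out like this:

```python
# Price premium of each card relative to the 390X (NZD, from the post)
prices = {"980 Ti": 1200, "Fury X": 1270, "Fury": 1035,
          "390X": 700, "390": 600, "970": 600}
base = prices["390X"]
for card, nzd in prices.items():
    print(f"{card}: {nzd} NZD ({nzd / base:.2f}x the 390X)")
```

The Fury comes out at roughly 1.48x the 390X's price, which is the gap being called out here.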
I'm never going to understand pricing over at AUS/NZ.
-
It always gets messed up. -
No, I mean how it gets so expensive. I know why, I just don't get how (I need the numbers to understand how).
-
Your monitor's refresh rate and your GPU's frame rate (not the CPU's) are not the same thing. If the GPU can't provide a constant, fixed 30, 60, or 144 FPS, then you get tearing (no v-sync) or stuttering (with v-sync).
Both G-Sync and FreeSync eliminate this problem by varying the refresh rate.
http://www.geforce.com/hardware/technology/g-sync/technology -
Hey guys, got a question for you. I am looking into buying a GA (I have the A15), but I am trying to lock down a good card prior to buying it. I know Nvidia is having some driver issues, so I want to stick with AMD. I am looking at the new R9 Nano, but I haven't heard much about it, specifically in the GA. Does anyone have one in their GA? Or does anyone have a recommendation?
I have heard things about the Hyper X, but I am worried about it fitting in the case; does it? I don't want anything sticking out etc.
Budget is anything goes, as long as it's good.
Thanks. -
-
I'm not sure how you think G-Sync fits into this. I'm just talking about the fact that the CPU matters more the more frames you render, so if you have a weak CPU that can't feed the GPU fast enough to reach 144 Hz at decent settings, there would be no point getting a 144 Hz monitor. -
Does anyone happen to know where these CPUs would bottleneck? Also, I have the "old" Alienware 15, so I've got the i7-4710HQ chip. I ask because I am all for spending the money on a nice graphics card, but if my CPU isn't going to max it out, or near max it out, there is no point.
Side note: in the future I plan to buy either SteamVR or the Oculus Rift, and I would like a setup that can handle it well.
Thanks -
*OFFICIAL* Alienware "Graphics Amplifier" Owner's Lounge and Benchmark Thread (All 13, 15 and 17)
Discussion in '2015+ Alienware 13 / 15 / 17' started by Mr. Fox, Dec 10, 2014.