gotta wait for 20nm, is it 20nm?
-
It's still 28nm. -
According to that Maxwell thread, apparently laptop Maxwell will/might be 20nm, while desktop Maxwell will still be 28nm. So who knows lol
-
-
Yeah I gave up speculating and following that thread. As long as I get my promised desktop 980 that performs 10-15% better than 780Ti while costing <$500 I'll be one happy camper. Otherwise I'm returning all the desktop parts I ordered and rage quitting LOL
-
I noticed that the CNET review explicitly mentions the Alienware 13 using the same 860m that is in the Lenovo Y50t model - does anyone know which 860m this is, since there is concern over which one might end up in the AW 13?
-
It's Maxwell. It should be the same one in the ROG 750JM as well.
At 1080p, you should see similar performance to this:
-
I believe both have a 4700HQ but the newer one's CPU seems to be running about 10°C hotter. The newer GPU was about 6°C cooler. Makes me wonder, depending on how much older it is, whether or not dirt and lint have collected in the heat sink on the older one. Only speculation, but GPU temps could be very similar and the CPU much cooler on the older Asus. I wonder if they changed their cooling system at all. All things considered, not enough difference in performance or thermals between them to justify "upgrading" to the newer one with Maxwell 860M as demonstrated in this video. Based solely on this video the Kepler 770M seems like the better GPU and has 1GB more vRAM (3GB for 770M versus 2GB for the 860M).
Edit: Here is a thread that follows that video comparing the two models. G750JX VS G750JM (770m vs 860m). Looks like the video author thinks the older machine is better and not very impressed with the newer machine or Asus support (no real surprise about that). "Anyways laptop reach 85-86c everytime i play a game for like 1-3 min..." doesn't sound very good for such conservative hardware specs. -
ASUS support has never been that great. It's a shame, really.
-
Not sure how reliable the comparison is, but it was interesting that in comparing the 860m vs 770m on GpuBoss.com, they show nearly identical results in some tests, but in Crysis the 860m is substantially faster (42.37 fps vs. 29.43 fps respectively).
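For what it's worth, a quick back-of-the-envelope calculation puts that Crysis gap in perspective (fps figures taken from the post above; the GPUBoss numbers themselves are unverified):

```python
# Quick check of how big the quoted GPUBoss Crysis gap actually is.
# fps figures come from the post above; GPUBoss data is unverified.
fps_860m = 42.37
fps_770m = 29.43

advantage = fps_860m / fps_770m - 1
print(f"860m advantage over 770m: {advantage:.0%}")  # -> 44%
```

So "substantially faster" here works out to roughly a 44% lead in that one title, even if the two cards trade blows elsewhere.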
-
GPU Boss is notorious for publishing inaccurate information. They might get better in time, but a lot of what I have seen posted there is flawed misinformation. It seems to be a repository for information that has not been validated for accuracy... kind of like relying on Facebook for facts.
There are not many good places to turn to for accurate information backed by performance test results. If all you want to know about is click-and-run stock performance, then Passmark's CPU/GPU database and Notebookcheck.net will work in a pinch.
Although I do not know for certain if results for the Maxwell 860M are in the Passmark database yet, they have a nice performance comparison feature. Searching 3DMark's web site for benchmark results is (I think) the best way to tell how well hardware performs.
Here is the highest posted Fire Strike benchmark result using that same model ASUSTeK COMPUTER INC. G750JM shown in the video with a Maxwell 860M overclocked.
NVIDIA GeForce GTX 860M(1x) and Intel Core i7-4700HQ
Here is an overclocked 770M SLI versus 860M SLI Fire Strike run: NVIDIA GeForce GTX 860M(2x) - 5775 3DMarks versus NVIDIA GeForce GTX 770M(2x) - 7187 3DMarks
Wow, that Clevo P375SM with 770M SLI totally emasculates the wimpy Aorus (Gigabyte X7V2-CF1) toy with 860M SLI, LOL.
Here is a comparison of 860M (possibly the Maxwell variant) with 770M using the Passmark database.
To help put the CPU performance into context with something many of us are familiar with, this is the approximate performance equivalent of an i5-4210U. The reviews mention a choice between i5 and i7 ULV processors will be available. This example of such an i5 ULV CPU is pretty pathetic. Since they are BGA there probably will not be a huge variety of motherboards, maybe 2 to 4 options at most I would guess. Could be interesting to see what the choices are.
-
So, at least in terms of that benchmark, both cards are one heck of a step up from the 765 currently in use (2114 on Passmark). I was aware of GPUBoss.com's spotty performance data, which is why I couched my question concerning it, but it does make me wonder, if the benchmarks are correct, whether this difference (slight in some cases but significant in others) is a result of differences in speed and architecture, so that the 860m is at least equal in some cases but better in others. If the heat load is good, it seems like a fair trade-off.
Not sure about how hot the 770m runs relative to the 860m (I would expect the latter to run cooler), but in any event it's a serious step up from the current AW 14's top-end GPU, so unless the price is seriously out of line, I really can't see a reason to be hating on the 860m in the AW 13. -
Oh, yeah... They are both at least 20% better than the 765M. They can also be overclocked a good 10%-15%.
The 860M runs very cool. I had a 750JM for a while and it did not reach above 71C with stock paste. The new Maxwell cards should be a very nice upgrade from Kepler. -
If those two Asus systems in that video are basically the same machine (same chassis, heat sinks, etc.) it might be an indicator, but as long as it doesn't overheat it really doesn't make any difference if one runs a few degrees warmer than the other. If the temps are good then there is plenty of wiggle room before they can be viewed as "bad" in a relevant way. Cool is preferred, but cool enough is really all that matters.
I think overall system performance will have a lot to do with the CPU. Having a respectable GPU will not be adequate to make up for an anemic CPU. Hopefully, they will offer an i7-47XXU for 4C/8T performance at least up to 3.5GHz. We will not know for certain what the CPU options are until Alienware actually shows us. The information in the reviews may be off base on some things. While most games are primarily GPU dependent, there are some games (and other software) that will take a pretty big hit on performance with a weak CPU. A dual-core ULV CPU would be a massive reduction in performance compared to the 3840QM in your M14xR2, Docsteel. I think you would notice it and probably not be very happy with the end result.
We should have some answers soon... probably won't be too much longer until the web site opens for pre-orders and allows us to have a gander at the configuration options. Even though little machines like this are not my cup of tea, I would still like to see Alienware staying out in front of their competition and bowing to none of their competitors where performance is concerned. -
didnt think anyone would use GPU/CPU boss as reference lol
-
Sometimes they base their information on NotebookCheck, which I know is a more legitimate source. But I never refer to GPUBoss as my sole reference. Nobody should. You should always have more than one to present accurate information or an argument.
-
I definitely agree with the "cool enough" if we are talking component wear, etc. being essentially equal.
As for the Dual-core vs Quad, I definitely know it's a compromise in terms of performance (almost daily I move back and forth between a desktop with an Extreme hexacore processor and SLI Titans down to a lowly M11x-R3; the M14x-R2 is normally for on-the-road work). I would _prefer_ a full-on i7 personally; however, I have to be honest: even taking it on the road and using it for work, I rarely run more than two or three apps at a time, or say a game with music streaming, which is probably quite a common usage pattern. ULV dual-cores handle this fine, and few games make decent enough use of more than what 4 logical cores can provide for gaming purposes. -
Lol - the point wasn't how reliable they are or their relative ratings; it's that the two FPS counts are probably easily checked, and if correct (from what I have seen in other sources they are so far), then it does point to a case where how CPU- vs. GPU-bound a game is may determine what difference the two GPUs make. Honestly, I didn't think anyone would stoop low enough to criticize a casual reference -
May need to take this with a grain of salt.
I was talking to an Alienware tech about a completely unrelated issue, and the topic of new laptops came up. I mentioned the Alienware 13 and that I might pick one up, and he responded that Alienware doesn't make a 13" laptop. I then sent him the link in the beginning of this thread and he seemed very surprised. After looking it over for a bit I said the laptop looks great, but I don't know what that extra port in the back is for. He took a look at the picture of the back and started to giggle, and said I know what it is for. I began to ask how, since he didn't even know about the laptop, and he said that Alienware R&D has played with that port before and that's how he knows what it's for.
I then began to pester him to tell me, to which he said the calls are recorded and so he couldn't disclose that information, but when I asked if it was an external GPU port or something he hesitated to say no and laughed some more. So those who guessed that it might be a port to support an external GPU, you may be right. -
I thought Alienware would have a real Razer Blade, Aorus X3+, MSI Ghost competitor, but it's still not quite the package to compete. 860m is good, but there will likely be an 860m replacement with the 900m series in the next couple of months.
860m Maxwell will trounce the Kepler version Watt for Watt. Maxwell runs cool, can manage 1080p and below perfectly fine, and can be run near stock 870m performance with an overclock, still drawing less than 120W for the system and < 75C for the GPU. The ULV CPU will definitely kill any advantage, though, that any reasonably powerful GPU can offer. I like that the likes of Razer and MSI and Aorus are pushing the thinness limits, but they also need to step back a bit and work on a fully integrated solution, not just go as thin as possible for the sake of doing it. The Razer Blade 14 I tested was very nice, but it also would burn your legs, and possibly your fingers if you had your hand resting in the wrong spot. It just didn't make much sense.
The AW 13 seems like they cared about thermals and allowing the system to breathe. It may be the first system to actually have been designed to be thermally optimized in the small and thin form factor. -
I'm also not convinced that an 860m (or ULV) is all we will ever see, if the 900 series is truly that close... again, it wouldn't surprise me at all if we see a replay of the M11x-R1 followed by what AW/Dell wanted to release but the chips weren't ready for (the M11x-R2), so that one slightly underwhelming version is rapidly followed by a refresh with Broadwell and a 9xxM. Of course, people will still go off on it being ULV, wanting a lap toaster instead
-
It's 99% likely now that the A13 will use an external GPU since it's a thing now
http://forum.notebookreview.com/msi/760966-msi-gs30-revealed.html -
Eh hopefully the "docking station" isn't that big though. It looks really big. It looks interesting but lugging around a dock like that would be annoying. Makes sense if you're leaving it in one location to dock later, but for taking it around all the time, the AW18 seems a bit more convenient lol.
-
That whole idea is weird to me. It contradicts the whole point of the AW 13 - extreme portability.
In my opinion, having a "docking station" on a desk that you must plug the AW 13 into also doesn't seem very attractive. A 13" screen on a desk, far away from your face, will be too tiny to game on. The AW 13 is not a desktop replacement. It is supposed to be a gaming laptop, literally. Maybe they have other plans for it. Who knows... -
Meaker@Sager Company Representative
It would be with a large external display at home.
-
I think the point of the docking station is that you can have a monitor, keyboard, mouse, and this "dock" all set up at home.
When you get home you can just plug the aw13 into it and game on a big screen with high power.... then pick it up and go.
I think it's a great idea and look forward to picking up the 13 when it's out. -
It would have to be a lot of performance for me to be interested in something like that.
-
-
-
I sincerely doubt the AW 13 will be using an external GPU dock though. There is zero actual evidence of that being the case, and tons of evidence pointing towards it having a GPU inside.
-
My thoughts exactly, Doc. The processors in these tiny 13" systems will bottleneck the heck out of a 780Ti, lol. This whole "dock" idea is hilarious.
Just noticed: "Doc" and "Dock", lol. -
The ASUS XG Station I have has an 8800GT in it, and I can upgrade it. Just haven't done it. (One of these days I am thinking about trying a 750Ti in it.) It's limited to the bandwidth of the ExpressCard slot so the card can't play at full potential, BUT it does boost the ability to play games on any laptop with an ExpressCard slot. I wanted to play with them when they were due out, but they were only sold in the land down under. Then they quit selling them. About 2-3 years ago one turned up on eBay for about 150 bucks. I got it just to experiment with. It does expand what you can play at home, but still isn't a replacement for a full-on gaming desktop. But it isn't meant to be.
I think an external dock that can expand what you can play would work; then take the same machine with you with all your files already on it. For me something like this could work because of the way I travel: oil rig worker in Saudi Arabia. That is why I am toting the R1 M11x with me now. We have weight restrictions on the helicopter and they are firm about it. So something like a 17/18 would be out of the question weight-wise right now. I know my old M9750 would be a no-go on the flight.
I would like to see a little stronger CPU in it though, at least as an option. Just waiting to see what they turn out.
Older single-core ThinkPads have an advanced dock that has a PCI-e desktop card slot in them. I think they stopped it when the dual cores came out. Several people modded them to run better cards, not just Quadro-type cards.
Looking at the new MSI dock, it looks massive. The ThinkPad one was a lot shorter, but the PC went on the top. My XG Station will fit a lot of places, as the only connector from it is the ExpressCard slot and it could be hidden if needed. -
"My laptop is smaller than yours, nah, nah." *sticks out red Kool-Aid stained tongue.*
"Yeah, well, mine gets 2 FPS more than yours in Flappy Bird, and gets an extra 5 minutes of battery life."
"Does not!"
"Does too!"
-
If DTRs keep getting neutered I may be in a unique position for eGPUs to actually suit my circumstances. But there are caveats. They really need to saddle that tiny laptop with a CPU that can stand the test of time and a more generous PSU than the 65 watts it's got right now. 15" screen. That dock also needs to offer SLI too, otherwise I'm going backwards... can't have that!
I actually get the idea of a docking station since I can take the laptop section to work or short trips where I tend not to use horsepower anyway. Then sit it on (near?) the gpu's dock when ever I game at home which is usually in the same spot. I could get used to that.
Things not to like: it's more of a headache to travel/move about with if I actually want to game, bench, or simply if I generally enjoy keeping all my faculties intact. The convenience factor is only good for as long as you don't want to use any horsepower. Hmmm... -
Meaker@Sager Company Representative
I'd still want a little GPU on the move; an 850M DDR3-class device would do, and make it able to clock up on battery as much as possible. With an external dock, let it support a closed loop with the radiator outside the enclosure and I would be happy lol.
-
Yeah, if they are going to bother with having an external device they might as well let people have their way with it. Hopefully they don't go do something dumb like stick a proprietary 200 watt PSU inside.
-
Only dual overclocked 880m's can compare to the performance of one NON overclocked 780ti. Also if you have 1 GPU you don't run into the issues that can sometimes crop up with SLI (screen tearing, some games not recognizing second card, etc...) -
Single or multi-GPU boils down to one thing only... user preference. OK, maybe two if we count budgetary constraints.
The only problems I ever have with SLI are caused by lazy game developers doing a sloppy job at their trade, LOL. Even so, when you have to deal with that kind of incompetence you just turn it off until they fix it, or get rid of the game if it runs bad even with a single GPU. It's not a problem to deal with either way. GTX 780M SLI holds its own against 780 Ti. It spanks pretty much all of the lesser AMD and NVIDIA desktop GPUs in a single GPU configuration, as well as many, if not all, of the more budget-friendly desktop GPUs in an SLI or CrossFire configuration.
It does mean you're losing out if you only want one GPU. I would call it losing out if it were me. I would not want a desktop or a laptop with a single NVIDIA GPU any more. I think SLI is just way too excellent to live with only one GPU. CrossFire isn't always great, but CrossFire bugs are not a valid reason to avoid SLI. For me there is no point in spending a dime on a desktop that doesn't stomp my M18xR2, and it takes an expensive desktop configuration to accomplish that. You cannot build a more potent desktop on a lunch-money budget.
I haven't encountered the sound card issue before... I saw where a single-GPU laptop had that problem when upgrading to a GPU it was never intended to have, but I have not seen it directly attributable to a multi-GPU setup. Are you speaking of an IRQ conflict or something like that? I ran into that a very long time ago, but it was an inexpensive and buggy motherboard in an older desktop. -
It will be interesting if AW/Dell actually tries an external docking station with a GPU to take over, but it's utterly pointless with a ULV chip in the laptop, even given my opinion that they're not altogether a bad thing for light, on-the-run gaming. Like one poster said too, you have to watch AW/Dell on the power supply end of things as well; I could easily see them pulling what they did with the Area 51s with a proprietary cable (way to be cheesy and greedy, Dell).
I actually have more hope now that a full-on i7 is possible with the AW 13 if an external GPU is in the cards... as cool as it is my desktop would still trounce it, so it's not for me, but I laud them for doing it if it works out.
I could also see Dell doing this: outfit a closed-environment docking station so it is a fixed card to match what is in the laptop, and it becomes some mutant form of SLI, all nice and proprietary the way they like it. Not sure that a ULV processor even with SLI 860m's wouldn't still be CPU-bottlenecked though; anyone know? -
There's not a whole lot I can add to what Mr Fox has so eloquently put. I've found my 780m SLI overclocked to bring the fight to a desktop 780ti even overclocked, only to lose out to the first page of the top 100 on a 3DMark 11 leaderboard.
If I were changing machines I'd want it to be a meaningful upgrade. I don't see the point in otherwise. If I'm not moving forward, I'm going backwards. So going to such a device would need to let me have my way with SLI to make it worthwhile.
Single GPU tends to work out of the box better for games, but I've always been able to fire up both my cards one way or another and have had a lot of satisfaction doing so. So performance is still what I'm lusting after here, as I'm yet to have any sound issues whatsoever from sporting SLI. -
I'll make a bet: no ULV CPU and no dedicated card, BUT an external solution with an MXM GPU. What can I win?
-
-
-
They will have an 860m or dedicated gpu for sure. Every hands-on review so far that got to play with the AW13 said it had an 860m Maxwell gpu. The ULV cpu is the component that is still not 100% confirmed.
A 13 inch laptop with an 860m and the ability to use an external gpu would be a very interesting product. The 860m alone should do great at 1080p. I'm sure we'll see a broadwell/960m model in 2015 as well, which should give the laptop a nice bump in speed.
It would be interesting if the port on the back is a proprietary Alienware dock port with Thunderbolt 2 built in. The dock could have gigabit ethernet, USB3 ports and the ability to use an external gpu (maybe even external displays). That would be a winner in the dock market. Take the laptop on the go with decent performance and hook it up for maximum performance. The only problem with that would be a ULV cpu bottlenecking a high end gpu like the 970/980 etc.
As for a single desktop gpu being more powerful than two laptop gpus, that makes perfect sense. We have two laptop gpus with a total of 200w tdp being able to compete with a single 250w desktop card. The laptop solution uses less power and adds a huge portability factor. The desktop 980 is rumored to have a 170w tdp and be 10% faster than a 780 Ti. If that turns out to be true, we're talking about a small 10% performance increase, but a nice 32% power savings, which will help shape the 980M to be a powerful gpu. I can see the 980M being a full 980 with just reduced clocks. -
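The arithmetic in that last paragraph checks out. Here is a quick sketch using the figures quoted above (250W for the 780 Ti, a rumored 170W and +10% performance for the desktop 980; the rumored numbers were unconfirmed at the time):

```python
# Sanity check of the rumored 980 numbers quoted in the post above:
# 780 Ti = 250 W, rumored 980 = 170 W and ~10% faster. These are
# rumors from the thread, not confirmed specs.

def power_savings(old_tdp, new_tdp):
    """Fractional power reduction going from old_tdp to new_tdp."""
    return (old_tdp - new_tdp) / old_tdp

savings = power_savings(250, 170)
print(f"Power savings: {savings:.0%}")  # -> 32%

# Performance per watt, normalizing the 780 Ti to 1.0
ppw_780ti = 1.00 / 250
ppw_980 = 1.10 / 170  # rumored +10% performance
print(f"Perf/W improvement: {ppw_980 / ppw_780ti - 1:.0%}")  # -> 62%
```

That roughly 60% perf-per-watt jump is the headroom that would let a cut-down, lower-clocked version of the same chip work as a strong mobile part, which is the 980M scenario the post describes.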
Okay. 860m then...
-
I like the idea of the dock, but personally, it wouldn't work for me. I want the performance wherever I go, even around the house. I guess if they offer an optional dock for eGPU but still include 860m (or soon to be released 960m), it would be OK. Give you reasonable performance on the go with no compromise at home for those that want it. But a ULV CPU would not cut it for a high end GPU anyhow. Minimum full voltage (i.e. 45-47W) i7 mobile quad. -
Karamazovmm Overthinking? Always!
-
I agree that dock doesn't need SLI. That need is more of a personal requirement in my lust for overkill. -
I don't see the point in an external GPU... just get a bigger laptop with internal GPU instead.
Also there's talk of NVidia skipping the 800 desktop series altogether and going straight to the 900 series with new architecture, releasing in October, the reason being to sync the architecture between the desktop and mobile GPU series. So who knows, perhaps we'll be seeing a 900M series sooner than expected too, with new architecture. Wouldn't that be nice... and surprising, considering AMD aren't exactly forcing NVidia into it... -
Yeah, it's a fancy gimmick a few of the OEMs are experimenting with. I'm with you, though... either make something truly awesome that stands on its own killer performance abilities without having to resort to morphodite contraptions, or just forget about it. Having a powerful desktop GPU in a box sitting on a desk that is not portable won't make up for a featherweight "gaming" system that has overall poor or mediocre performance. They are trying to compensate for a lack of something, and that seldom works well. It should stand on its own as a high performance laptop, sink or swim. If it can't, then sell it based upon what its own hardware specs tell us that it actually is... a thin and light system that is capable of playing games as long as the graphics settings are conservative. Nothing wrong with that if you want it, buy it. But let's not pretend it's something it's not by making a FrankenPC out of an Ultrabook.
Alienware 13 Pre-Release Speculation Thread
Discussion in '2015+ Alienware 13 / 15 / 17' started by tinker_xp, Aug 8, 2014.