The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information posted on the forums was preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.

    Alienware 13 Pre-Release Speculation Thread

    Discussion in '2015+ Alienware 13 / 15 / 17' started by tinker_xp, Aug 8, 2014.

  1. ole!!!

    ole!!! Notebook Prophet

    Reputations:
    2,879
    Messages:
    5,952
    Likes Received:
    3,982
    Trophy Points:
    431
    gotta wait for 20nm, is it 20nm?
     
  2. ssj92

    ssj92 Neutron Star

    Reputations:
    2,446
    Messages:
    4,446
    Likes Received:
    5,690
    Trophy Points:
    581

    It's still 28nm.
     
  3. n=1

    n=1 YEAH SCIENCE!

    Reputations:
    2,544
    Messages:
    4,346
    Likes Received:
    2,600
    Trophy Points:
    231
    According to that Maxwell thread, apparently laptop Maxwell will/might be 20nm, while desktop Maxwell will still be 28nm. So who knows lol
     
  4. ole!!!

    ole!!! Notebook Prophet

    Reputations:
    2,879
    Messages:
    5,952
    Likes Received:
    3,982
    Trophy Points:
    431
    lol the hell that makes no sense. so tonga and maxwell 20nm coming next yr maybe
     
  5. n=1

    n=1 YEAH SCIENCE!

    Reputations:
    2,544
    Messages:
    4,346
    Likes Received:
    2,600
    Trophy Points:
    231
    Yeah I gave up speculating and following that thread. As long as I get my promised desktop 980 that performs 10-15% better than 780Ti while costing <$500 I'll be one happy camper. Otherwise I'm returning all the desktop parts I ordered and rage quitting LOL
     
    unityole likes this.
  6. Docsteel

    Docsteel Vast Alien Conspiracy

    Reputations:
    776
    Messages:
    2,147
    Likes Received:
    911
    Trophy Points:
    131
I noticed that the CNET review explicitly mentions the Alienware 13 using the same 860m that is in the Lenovo Y50t model - anyone know which 860m this is, since there is concern about which one might end up in the AW 13?
     
  7. J.Dre

    J.Dre Notebook Nobel Laureate

    Reputations:
    3,700
    Messages:
    8,323
    Likes Received:
    3,820
    Trophy Points:
    431
    It's Maxwell. It should be the same one in the ROG 750JM as well.

    At 1080p, you should see similar performance to this:

     
    Last edited by a moderator: May 12, 2015
    reborn2003 and Mr. Fox like this.
  8. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist

    Reputations:
    37,213
    Messages:
    39,333
    Likes Received:
    70,624
    Trophy Points:
    931
    I believe both have a 4700HQ but the newer one's CPU seems to be running about 10°C hotter. The newer GPU was about 6°C cooler. Makes me wonder, depending on how much older it is, whether or not dirt and lint have collected in the heat sink on the older one. Only speculation, but GPU temps could be very similar and the CPU much cooler on the older Asus. I wonder if they changed their cooling system at all. All things considered, not enough difference in performance or thermals between them to justify "upgrading" to the newer one with Maxwell 860M as demonstrated in this video. Based solely on this video the Kepler 770M seems like the better GPU and has 1GB more vRAM (3GB for 770M versus 2GB for the 860M).

    Edit: Here is a thread that follows that video comparing the two models. G750JX VS G750JM (770m vs 860m). Looks like the video author thinks the older machine is better and not very impressed with the newer machine or Asus support (no real surprise about that). "Anyways laptop reach 85-86c everytime i play a game for like 1-3 min..." doesn't sound very good for such conservative hardware specs.
     
  9. J.Dre

    J.Dre Notebook Nobel Laureate

    Reputations:
    3,700
    Messages:
    8,323
    Likes Received:
    3,820
    Trophy Points:
    431
    ASUS support has never been that great. It's a shame, really.
     
  10. Docsteel

    Docsteel Vast Alien Conspiracy

    Reputations:
    776
    Messages:
    2,147
    Likes Received:
    911
    Trophy Points:
    131
Not sure how reliable the comparison is, but it was interesting that in comparing the 860m vs 770m on GpuBoss.com, they show nearly identical results for Crysis, but the 860m is substantially faster (42.37 fps vs. 29.43 fps, respectively).
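For scale, the gap between those two quoted frame rates works out to roughly a 44% advantage for the 860m. A quick sanity check on the figures from the post above (a sketch using only the numbers quoted there):

```python
# fps figures as quoted from GPUBoss in the post above
fps_860m = 42.37
fps_770m = 29.43

# relative advantage of the 860m over the 770m
advantage = (fps_860m - fps_770m) / fps_770m * 100
print(f"860m advantage: {advantage:.1f}%")  # → 860m advantage: 44.0%
```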
     
  11. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist

    Reputations:
    37,213
    Messages:
    39,333
    Likes Received:
    70,624
    Trophy Points:
    931
    GPU Boss is notorious for publishing inaccurate information. They might get better in time, but a lot of what I have seen posted there is flawed misinformation. It seems to be a repository for information that has not been validated for accuracy... kind of like relying on Facebook for facts. ;)

    There are not many good places to turn to for accurate information backed by performance test results. If all you want to know about is click-and-run stock performance, then Passmark's CPU/GPU database and Notebookcheck.net will work in a pinch.

Although I do not know for certain if results with the Maxwell 860M are in the Passmark database yet, they have a nice performance comparison feature. Searching 3DMark's web site for benchmark results is (I think) the best way to tell how well hardware performs.

    Here is the highest posted Fire Strike benchmark result using that same model ASUSTeK COMPUTER INC. G750JM shown in the video with a Maxwell 860M overclocked.

    NVIDIA GeForce GTX 860M(1x) and Intel Core i7-4700HQ

    Here is an overclocked 770M SLI versus 860M SLI Fire Strike run: NVIDIA GeForce GTX 860M(2x) - 5775 3DMarks versus NVIDIA GeForce GTX 770M(2x) - 7187 3DMarks

    Wow, that Clevo P375SM with 770M SLI totally emasculates the wimpy Aorus (Gigabyte X7V2-CF1) toy with 860M SLI, LOL.

    Here is a comparison of 860M (possibly the Maxwell variant) with 770M using the Passmark database.

    Compare-860-770M.JPG

    To help put the CPU performance into context with something many of us are familiar with, this is the approximate performance equivalent of an i5-4210U. The reviews mention a choice between i5 and i7 ULV processors will be available. This example of such an i5 ULV CPU is pretty pathetic. Since they are BGA there probably will not be a huge variety of motherboards, maybe 2 to 4 options at most I would guess. Could be interesting to see what the choices are.

    CPU-Compare.JPG
     
    HTWingNut and Caladdon like this.
  12. Docsteel

    Docsteel Vast Alien Conspiracy

    Reputations:
    776
    Messages:
    2,147
    Likes Received:
    911
    Trophy Points:
    131
So, at least in terms of that benchmark, both cards are one heck of a step up from the 765M currently in use (2114 on Passmark). I was aware of GPUBoss.com's spotty performance data, which is why I couched my question concerning it, but it does make me wonder, if the benchmarks are correct, whether this difference being slight in some cases but significant in others is a result of differences in speed and architecture, so that the 860m is at least equal in some cases and better in others. If the heat load is good, it seems like a fair trade-off.

Not sure about how hot the 770m runs relative to the 860m (I would expect the latter to run cooler), but in any event it's a serious step up from the current AW 14's top-end GPU, so unless the price is seriously out of line, I really can't see a reason to be hating on the 860m in the AW 13.
     
  13. J.Dre

    J.Dre Notebook Nobel Laureate

    Reputations:
    3,700
    Messages:
    8,323
    Likes Received:
    3,820
    Trophy Points:
    431
    Oh, yeah... They are both at least 20% better than the 765M. They can also be overclocked a good 10%-15%.

    The 860M runs very cool. I had a 750JM for a while and it did not reach above 71C with stock paste. The new Maxwell cards should be a very nice upgrade from Kepler.
     
  14. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist

    Reputations:
    37,213
    Messages:
    39,333
    Likes Received:
    70,624
    Trophy Points:
    931
If those two Asus systems in that video are basically the same machine (same chassis, heat sinks, etc.) it might be an indicator, but as long as it doesn't overheat it really doesn't make any difference if one runs a few degrees warmer than the other. If the temps are good then there is plenty of wiggle room before they can be viewed as "bad" in a relevant way. Cool is preferred, but cool enough is really all that matters.

    I think overall system performance will have a lot to do with the CPU. Having a respectable GPU will not be adequate to make up for an anemic CPU. Hopefully, they will offer an i7-47XXU for 4C/8T performance at least up to 3.5GHz. We will not know for certain what the CPU options are until Alienware actually shows us. The information in the reviews may be off base on some things. While most games are primarily GPU dependent, there are some games (and other software) that will take a pretty big hit on performance with a weak CPU. A dual-core ULV CPU would be a massive reduction in performance compared to the 3840QM in your M14xR2, Docsteel. I think you would notice it and probably not be very happy with the end result.

    We should have some answers soon... probably won't be too much longer until the web site opens for pre-orders and allows us to have a gander at the configuration options. Even though little machines like this are not my cup of tea, I would still like to see Alienware staying out in front of their competition and bowing to none of their competitors where performance is concerned.
     
  15. ole!!!

    ole!!! Notebook Prophet

    Reputations:
    2,879
    Messages:
    5,952
    Likes Received:
    3,982
    Trophy Points:
    431
    didnt think anyone would use GPU/CPU boss as reference lol
     
  16. J.Dre

    J.Dre Notebook Nobel Laureate

    Reputations:
    3,700
    Messages:
    8,323
    Likes Received:
    3,820
    Trophy Points:
    431
    Sometimes they base their information on NotebookCheck, which I know is a more legitimate source. But I never refer to GPUBoss as my sole reference. Nobody should. You should always have more than one to present accurate information or an argument.
     
  17. Docsteel

    Docsteel Vast Alien Conspiracy

    Reputations:
    776
    Messages:
    2,147
    Likes Received:
    911
    Trophy Points:
    131
    I definitely agree with the "cool enough" if we are talking component wear, etc. being essentially equal.

As for the dual-core vs quad, I definitely know it's a compromise in terms of performance (almost daily I move back and forth between a desktop with an Extreme hexacore processor and SLI Titans down to a lowly M11x-R3; the M14x-R2 is normally for on-the-road work). I would _prefer_ a full-on i7 personally; however, I have to be honest, even taking it on the road and using it for work, I rarely do more than two to three apps at a time, or say a game with music streaming, which is probably quite a common usage pattern. ULV dual-cores handle this fine, and few games make decent use of more than what 4 logical cores can provide for gaming purposes.
     
  18. Docsteel

    Docsteel Vast Alien Conspiracy

    Reputations:
    776
    Messages:
    2,147
    Likes Received:
    911
    Trophy Points:
    131

Lol - the point wasn't how reliable they are or their relative ratings, it's that the two FPS counts are probably easily checked, and if correct (from what I have seen in other sources they are so far), then it does point to a case where what difference the two GPUs make may depend on how CPU- vs. GPU-bound a game is. Honestly, I didn't think anyone would stoop low enough to criticize a casual reference ;)
     
  19. desiplaya130

    desiplaya130 Notebook Consultant

    Reputations:
    22
    Messages:
    158
    Likes Received:
    21
    Trophy Points:
    31
May need to take this with a grain of salt.

I was talking to an Alienware tech about a completely unrelated issue, and the topic of new laptops came up. I mentioned the Alienware 13 and that I might pick one up, and he responded that Alienware doesn't make a 13" laptop. I then sent him the link in the beginning of this thread and he seemed very surprised. After looking it over for a bit I said the laptop looks great, but I don't know what that extra port in the back is for. He took a look at the picture of the back, started to giggle, and said "I know what it is for." I began to ask how, since he didn't even know about the laptop, and he said that Alienware R&D has played with that port before and that's how he knows what it's for.

I then began to pester him to tell me, to which he said the calls are recorded and so he couldn't disclose that information, but when I asked if it was an external GPU port or something he hesitated to say no and laughed some more. So those who guessed that it might be a port to support an external GPU, you may be right.
     
  20. HTWingNut

    HTWingNut Potato

    Reputations:
    21,580
    Messages:
    35,370
    Likes Received:
    9,877
    Trophy Points:
    931
Yeah, I saw the ULV CPU... unless there will be a quad-core ULV, forget it. Newer games will benefit from faster cores and more cores. Not to mention the ULV CPUs only support up to 1600MHz DDR3. Granted most of these thin gaming lightweights throttle the CPU and run pretty darn hot, but performance will still be pretty meager compared with a 35W quad-core counterpart.
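On the DDR3-1600 ceiling mentioned here, the theoretical peak bandwidth is easy to work out (a sketch assuming the standard 64-bit channel width and dual-channel operation):

```python
# DDR3-1600: 1600 MT/s per channel, 64-bit (8-byte) channel width
transfers_per_sec = 1600e6
bytes_per_transfer = 8
channels = 2  # assuming dual-channel operation

peak_gbps = transfers_per_sec * bytes_per_transfer * channels / 1e9
print(f"Dual-channel DDR3-1600 peak: {peak_gbps:.1f} GB/s")  # → 25.6 GB/s
```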

    I thought Alienware would have a real Razer Blade, Aorus X3+, MSI Ghost competitor, but it's still not quite the package to compete. 860m is good, but there will likely be an 860m replacement with the 900m series in the next couple of months.


860m Maxwell will trounce the Kepler version watt for watt. Maxwell runs cool and can manage 1080p and below perfectly fine, and can run near stock 870m performance with an overclock, still drawing less than 120W for the system and < 75C for the GPU. The ULV CPU will definitely kill any advantage, though, that any reasonably powerful GPU can offer. I like that the likes of Razer and MSI and Aorus are pushing the thinness limits, but they also need to step back a bit and work on a fully integrated solution, not just go as thin as possible for the sake of doing it. The Razer Blade 14 I tested was very nice, but it also would burn your legs, and possibly your fingers if you had your hand resting in the wrong spot. It just didn't make much sense.

    The AW 13 seems like they cared about thermals and allowing the system to breathe. It may be the first system to actually have been designed to be thermally optimized in the small and thin form factor.
     
    Mr. Fox likes this.
  21. Docsteel

    Docsteel Vast Alien Conspiracy

    Reputations:
    776
    Messages:
    2,147
    Likes Received:
    911
    Trophy Points:
    131
In some quick reading on this point two game engines came up, Frostbite and Crysis. In each case performance was enhanced, but I noticed that people were confusing cores with threads. ULV processors give you four threads (four logical cores) from two physical cores. I find it hard to imagine AW/Dell is not cognizant of this and its impact on a broad spectrum of games, and in preliminary testing is finding fps rates acceptable to the target audience of the design. OTOH I do agree that game engines are trending towards "more cores = better performance," so future-proofing is an issue, but again it depends on your tolerance for fps.

    I'm also not convinced that an 860m (or ULV) is all we will ever see, if the 900 series is truly that close.... again, wouldn't surprise me at all that we see a replay of the M11x-r1 followed by what AW/Dell wanted to release but the chips weren't ready, the M11x-r2, so that one slightly underwhelming version is rapidly followed by a refresh with Broadwell and a 9xxm. Of course, people will still go off on it being ULV wanting a lap toaster instead ;)

Frankly, I am beginning to suspect the AW 13 design is a lead-in to a thinner, lighter AW 17 (and given the popularity of the 15" MBP, possibly a return of the 15), which would definitely have full-on i7 quads. AW/Dell are not fools, but they do have to produce products that don't incur lawsuits or loads of support issues. I feel certain they see the handwriting on the wall that thin is in, but that they still need to make a reliable, heat-managed product in the process. The move to ULV *first* might be them testing the waters a bit before getting more bold with it.
     
  22. HSN21

    HSN21 Notebook Deity

    Reputations:
    358
    Messages:
    756
    Likes Received:
    94
    Trophy Points:
    41
  23. ssj92

    ssj92 Neutron Star

    Reputations:
    2,446
    Messages:
    4,446
    Likes Received:
    5,690
    Trophy Points:
    581
    Eh hopefully the "docking station" isn't that big though. It looks really big. It looks interesting but lugging around a dock like that would be annoying. Makes sense if you're leaving it in one location to dock later, but for taking it around all the time, the AW18 seems a bit more convenient lol.
     
  24. J.Dre

    J.Dre Notebook Nobel Laureate

    Reputations:
    3,700
    Messages:
    8,323
    Likes Received:
    3,820
    Trophy Points:
    431
    That whole idea is weird to me. It contradicts the whole point of the AW 13 - extreme portability.

    In my opinion, having a "docking station" on a desk where you must plug the AW 13 into also doesn't seem very attractive. A 13" screen on a desk, far away from your face, will be too tiny to game on. The AW 13 is not a desktop replacement. It is supposed to be a gaming laptop, literally. Maybe they have other plans for it. Who knows...
     
  25. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,426
    Messages:
    58,175
    Likes Received:
    17,888
    Trophy Points:
    931
    It would be with a large external display at home.
     
  26. ejohnson

    ejohnson Is that lemon zest?

    Reputations:
    827
    Messages:
    2,278
    Likes Received:
    104
    Trophy Points:
    81
I think the point of the docking station is that you can have a monitor, keyboard, mouse, and this "dock" set up at home. When you get home you can just plug the AW13 into it and game on a big screen with high power... then pick it up and go.

I think it's a great idea and look forward to picking up the 13 when it's out.
     
    reborn2003 likes this.
  27. J.Dre

    J.Dre Notebook Nobel Laureate

    Reputations:
    3,700
    Messages:
    8,323
    Likes Received:
    3,820
    Trophy Points:
    431
    It would have to be a lot of performance for me to be interested in something like that.
     
  28. Docsteel

    Docsteel Vast Alien Conspiracy

    Reputations:
    776
    Messages:
    2,147
    Likes Received:
    911
    Trophy Points:
    131
Agreed - it doesn't make sense at all, particularly if the statements about ULV processors are true. I'm not convinced based on that report, that is, but...

I think this is more likely. It could even be something odd, like they are mating a second GPU into the docking station to serve the KVM... not to mention better sound outputs, likely. I could dig on that actually too.
     
  29. Defengar

    Defengar Notebook Deity

    Reputations:
    250
    Messages:
    810
    Likes Received:
    40
    Trophy Points:
    41
    If reports are true, that MSI docking station will be able to house any desktop GPU... even say, a 780ti....
     
    reborn2003 likes this.
  30. Defengar

    Defengar Notebook Deity

    Reputations:
    250
    Messages:
    810
    Likes Received:
    40
    Trophy Points:
    41
    I sincerely doubt the AW 13 will be using an external GPU dock though. There is zero actual evidence of that being the case, and tons of evidence pointing towards it having a GPU inside.
     
  31. J.Dre

    J.Dre Notebook Nobel Laureate

    Reputations:
    3,700
    Messages:
    8,323
    Likes Received:
    3,820
    Trophy Points:
    431
    My thoughts exactly, Doc. The processors in these tiny 13" systems will bottleneck the heck out of a 780Ti, lol. This whole "dock" idea is hilarious.

    Just noticed: "Doc" and "Dock", lol. :D
     
  32. ADOR

    ADOR Evil Mad Scientist

    Reputations:
    520
    Messages:
    1,949
    Likes Received:
    210
    Trophy Points:
    81
With the ASUS XG Station I have, it has an 8800gt in it, and I can upgrade it. Just haven't done it. (One of these days I am thinking about trying a 750ti in it.) It's limited to the bandwidth of the ExpressCard slot so the card can't play at full potential, BUT it does boost the ability to play games on any laptop with an ExpressCard slot. I wanted to play with them when they were due out but they were only sold in the land down under. Then they quit selling them. About 2-3 years ago one turned up on eBay for about 150 bucks. I got it just to experiment with. It does expand what you can play at home, but still isn't a replacement for a full-on gaming desktop. But it isn't meant to be.
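The ExpressCard bottleneck mentioned here is substantial: ExpressCard carries a single PCIe lane, versus the 16 lanes a desktop slot normally provides. A rough comparison (a sketch assuming PCIe 2.0 signaling at roughly 500 MB/s of usable bandwidth per lane after encoding overhead):

```python
# approximate usable bandwidth per PCIe 2.0 lane, in MB/s (after 8b/10b overhead)
mb_per_lane = 500

expresscard_mb = 1 * mb_per_lane    # ExpressCard exposes a single lane
desktop_x16_mb = 16 * mb_per_lane   # a desktop slot exposes 16 lanes

print(f"ExpressCard: {expresscard_mb} MB/s vs x16 slot: {desktop_x16_mb} MB/s "
      f"({desktop_x16_mb // expresscard_mb}x)")
```

The ratio is what matters: at the same signaling generation, the eGPU link has one sixteenth of the slot bandwidth, which is why the card "can't play at full potential."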

    I think a external dock that can expand what you can play would work, then take the same machine with you with all your files already on it. For me something like this could work because of the way I travel. Oil rig worker in Saudi Arabia. That is why I am toting the R1 M11x with me now. We have weight restrictions on the helicopter and they are firm about it. So something like a 17/18 would be out of the question weight wise right now. I know my old M9750 would be a no go on the flight.

    I would like to see a little stronger CPU in it though, at least as a option. Just waiting to see what they turn out.


Older single-core ThinkPads have an advanced dock that has a PCI-e desktop card slot in it. I think they stopped it when the dual-cores came out. Several people modded them to take better cards, not just Quadro-type cards.

Looking at the new MSI dock, it looks massive. The ThinkPad one was a lot shorter, but the PC went on top. My XG Station will fit a lot of places, as the only connector from it is the ExpressCard slot, and it could be hidden if needed.
     
    reborn2003 likes this.
  33. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist

    Reputations:
    37,213
    Messages:
    39,333
    Likes Received:
    70,624
    Trophy Points:
    931
Well, what if it has BOTH... that would be way better than the competitors' offering if they have only Intel HD graphics. If they release this with an i7 so it doesn't bottleneck the crap out of everything, it would be pretty unique to have such a small machine with so-so gaming performance on the go and a 780 Ti it could plug into at home. I'd never consider buying something like that, but the thin and light fans would probably eat it up. Having no decent gaming option except when sitting your buns at a desk would suck... might as well just buy a cheap Chromebook and build a desktop for gaming if you have to be tethered. This could bridge that gap, but it needs an i7 to pull it off reasonably well.

    I have to agree with you... I've never personally thought having a laptop with an eGPU was anything to get excited about. It's a cool geeky thing for sure, but it's like compromising too much in both directions to me... not quite what you would expect from a desktop or bonafide DTR, and a counter-intuitive limitation for a laptop that could have just been made more powerful to begin with. But, doesn't matter what I want or think if Alienware makes money on it... more power to 'em. There's probably not going to be anything left that I'm interested in owning any more the way laptops are headed with compromises in performance for the sake of smallness. I suppose that will be the new bragging right.

    "My laptop is smaller than yours, nah, nah." *sticks out red Kool-Aid stained tongue.*

    "Yeah, well, mine gets 2 FPS more than yours in Flappy Bird, and gets an extra 5 minutes of battery life."​

    "Does not!"

    "Does too!"​

    :D
     
    HTWingNut, TBoneSan and J.Dre like this.
  34. TBoneSan

    TBoneSan Laptop Fiend

    Reputations:
    4,460
    Messages:
    5,558
    Likes Received:
    5,798
    Trophy Points:
    681
If DTRs keep getting neutered I may be in a unique position for eGPUs to actually suit my circumstances. But there are caveats. They really need to saddle that tiny laptop with a CPU that can stand the test of time and a more generous PSU than the 65 watts it's got right now. And a 15" screen. That dock also needs to offer SLI too, otherwise I'm going backwards... can't have that! :D

I actually get the idea of a docking station, since I can take the laptop section to work or on short trips where I tend not to use horsepower anyway. Then sit it on (near?) the GPU dock whenever I game at home, which is usually in the same spot. I could get used to that.

Things not to like: it's more of a headache to travel/move about with if I actually want to game or bench, or simply if I generally enjoy keeping all my faculties intact. The convenience factor is only good for as long as you don't want to use any horsepower. Hmmm...
     
    Mr. Fox likes this.
  35. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,426
    Messages:
    58,175
    Likes Received:
    17,888
    Trophy Points:
    931
I'd still want a little GPU on the move; an 850M DDR3-class device would do, and make it able to clock up on battery as much as possible. With an external dock, let it support a closed loop with the radiator outside the enclosure and I would be happy lol.
     
  36. TBoneSan

    TBoneSan Laptop Fiend

    Reputations:
    4,460
    Messages:
    5,558
    Likes Received:
    5,798
    Trophy Points:
    681
Yeah, if they are going to bother with having an external device they might as well let people have their way with it. Hopefully they don't go do something dumb like stick a proprietary 200 watt PSU inside.
     
    Mr. Fox likes this.
  37. Defengar

    Defengar Notebook Deity

    Reputations:
    250
    Messages:
    810
    Likes Received:
    40
    Trophy Points:
    41
I hope you realize that desktop GPUs outclass their similarly named laptop counterparts by a pretty significant margin. Just because you are going down to one GPU doesn't mean you are losing out.

Only dual overclocked 880Ms can compare to the performance of one non-overclocked 780ti. Also, with one GPU you don't run into the issues that can sometimes crop up with SLI (screen tearing, some games not recognizing the second card, etc...)
     
    Mr. Fox and TBoneSan like this.
  38. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist

    Reputations:
    37,213
    Messages:
    39,333
    Likes Received:
    70,624
    Trophy Points:
    931
    Single or multi-GPU boils down to one thing only... user preference. OK, maybe two if we count budgetary constraints.

    The only problems I ever have with SLI are caused by lazy game developers doing a sloppy job at their trade, LOL. Even so, when you have to deal with that kind of incompetence you just turn it off until they fix it, or get rid of the game if it runs bad even with a single GPU. It's not a problem to deal with either way. GTX 780M SLI holds its own against 780 Ti. It spanks pretty much all of the lesser AMD and NVIDIA desktop GPUs in a single GPU configuration, as well as many, if not all, of the more budget-friendly desktop GPUs in an SLI or CrossFire configuration.

    It does mean you're losing out if you don't want one GPU. I would call it losing out if it was for me. I would not want a desktop or a laptop with a single NVIDIA GPU any more. I think SLI is just way too excellent to live with only one GPU. CrossFire isn't always great, but CrossFire bugs are not a valid reason to avoid SLI. For me there is no point in spending a dime on a desktop that doesn't stomp my M18xR2 and it takes an expensive desktop configuration to accomplish that. You cannot build a more potent desktop on a lunch-money budget.

I haven't encountered the sound card issue before... I saw where a single-GPU laptop had that problem when upgrading to a GPU it was never intended to have, but I have not seen it directly attributable to a multi-GPU setup. Are you speaking of an IRQ conflict or something like that? I ran into that a very long time ago, but it was an inexpensive and buggy motherboard in an older desktop.
     
    TBoneSan likes this.
  39. Docsteel

    Docsteel Vast Alien Conspiracy

    Reputations:
    776
    Messages:
    2,147
    Likes Received:
    911
    Trophy Points:
    131
It will be interesting if AW/Dell actually tries an external docking station with a GPU to take over, but it's utterly pointless with a ULV chip on the laptop, even given my opinion that they're not altogether a bad thing for light-and-on-the-run gaming. Like one poster said too, you have to watch AW/Dell on the power supply end of things as well; I could easily see them pulling what they did with the Area 51's with a proprietary cable (way to be cheesy and greedy, Dell).

    I actually have more hope now that a full-on i7 is possible with the AW 13 if an external GPU is in the cards... as cool as it is my desktop would still trounce it, so it's not for me, but I laud them for doing it if it works out.

I could also see Dell doing this: outfit a closed-environment docking station so it is a fixed card to match what is in the laptop and it becomes some mutant form of SLI, all nice and proprietary the way they like it. Not sure a ULV processor even with SLI 860ms wouldn't still be CPU-bottlenecked, though; anyone know?
     
  40. TBoneSan

    TBoneSan Laptop Fiend

    Reputations:
    4,460
    Messages:
    5,558
    Likes Received:
    5,798
    Trophy Points:
    681

There's not a whole lot I can add to what Mr Fox has so eloquently put. I've found my 780m SLI overclocked to bring the fight to a desktop 780ti even overclocked, only to lose out to the first page of the top 100 on a 3DMark 11 leaderboard.
If I were changing machines I'd want it to be a meaningful upgrade. I don't see the point otherwise. If I'm not moving forward, I'm going backwards. So going to such a device would need to let me have my way with SLI to make it worthwhile.

Single GPU tends to work out of the box better for games, but I've always been able to fire up both my cards one way or another and have had a lot of satisfaction doing so. So performance is still what I'm lusting after here, as I'm yet to have any sound issues whatsoever from sporting SLI.
     
    Mr. Fox likes this.
  41. Kirrr

    Kirrr Notebook Deity

    Reputations:
    253
    Messages:
    901
    Likes Received:
    39
    Trophy Points:
    41
I'll make a bet: no ULV cpu and no dedicated card, BUT an external solution with an MXM gpu. What can I win?
     
  42. J.Dre

    J.Dre Notebook Nobel Laureate

    Reputations:
    3,700
    Messages:
    8,323
    Likes Received:
    3,820
    Trophy Points:
    431
    The dissatisfaction of being wrong. :D
     
    Mexic00ls likes this.
  43. Docsteel

    Docsteel Vast Alien Conspiracy

    Reputations:
    776
    Messages:
    2,147
    Likes Received:
    911
    Trophy Points:
    131
Yeah - I think all the prototype units around are sporting 860m's on board, so shipping without an onboard GPU is definitely not Alienware :)
     
  44. ssj92

    ssj92 Neutron Star

    Reputations:
    2,446
    Messages:
    4,446
    Likes Received:
    5,690
    Trophy Points:
    581
    They will have an 860M or other dedicated GPU for sure. Every hands-on review so far that got to play with the AW13 said it had an 860M Maxwell GPU. The ULV CPU is the component that is still not 100% confirmed.

    A 13 inch laptop with an 860M and the ability to use an external GPU would be a very interesting product. The 860M alone should do great at 1080p. I'm sure we'll see a Broadwell/960M model in 2015 as well, which should give the laptop a nice bump in speed.

    It would be interesting if the port on the back is a proprietary Alienware dock port with Thunderbolt 2 built in. The dock could have gigabit ethernet, USB 3.0 ports, and the ability to use an external GPU (maybe even external displays). That would be a winner in the dock market. Take the laptop on the go with decent performance and hook it up for maximum performance. The only problem with that would be a ULV CPU bottlenecking a high-end GPU like the 970/980 etc.


    As for a single desktop GPU being more powerful than two laptop GPUs, that makes perfect sense. We have two laptop GPUs with a combined 200W TDP competing with a single 250W desktop card. The laptop solution uses less power and adds a huge portability factor. The desktop 980 is rumored to have a 170W TDP and be 10% faster than a 780 Ti. If that turns out to be true, we're talking about a modest 10% performance increase but a nice 32% power savings, which will help shape the 980M into a powerful GPU. I can see the 980M being a full 980 with just reduced clocks.
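    Those percentages are easy to sanity-check. A quick back-of-the-envelope sketch, using only the rumored numbers from this post (none of these are confirmed specs):

```python
# Back-of-the-envelope check of the rumored figures quoted above:
# 780 Ti = 250 W TDP, desktop 980 = 170 W TDP and ~10% faster.
# These are rumors from the thread, not confirmed specs.

tdp_780ti = 250.0   # watts (known)
tdp_980 = 170.0     # watts (rumored)
perf_gain = 1.10    # rumored: 10% faster than a 780 Ti

# Relative power savings of the 980 vs the 780 Ti
power_savings = (tdp_780ti - tdp_980) / tdp_780ti
print(f"Power savings: {power_savings:.0%}")             # -> 32%

# Performance-per-watt improvement implied by those two rumors
perf_per_watt = perf_gain / (tdp_980 / tdp_780ti)
print(f"Perf-per-watt vs 780 Ti: {perf_per_watt:.2f}x")  # -> 1.62x
```

    So the "32% power savings" figure checks out, and if the rumors hold, Maxwell would deliver roughly 1.6x the performance per watt - which is exactly the headroom that would let a near-full desktop chip fit in a laptop power envelope.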
     
  45. Kirrr

    Kirrr Notebook Deity

    Reputations:
    253
    Messages:
    901
    Likes Received:
    39
    Trophy Points:
    41
    Okay. 860m then...
     
  46. HTWingNut

    HTWingNut Potato

    Reputations:
    21,580
    Messages:
    35,370
    Likes Received:
    9,877
    Trophy Points:
    931
    Not really - considering a 780 Ti can smoke an 880M SLI rig, it's not really needed.

    I like the idea of the dock, but personally it wouldn't work for me. I want the performance wherever I go, even around the house. I guess if they offer an optional dock for an eGPU but still include the 860M (or the soon-to-be-released 960M), it would be OK. That gives you reasonable performance on the go with no compromise at home, for those that want it. But a ULV CPU would not cut it for a high-end GPU anyhow. Minimum a full-voltage (i.e. 45-47W) mobile quad i7.
     
    Mr. Fox and TBoneSan like this.
  47. Karamazovmm

    Karamazovmm Overthinking? Always!

    Reputations:
    2,365
    Messages:
    9,422
    Likes Received:
    200
    Trophy Points:
    231
    That would be an interesting thing. It was something I expected from the MSI GS30; however, it wasn't delivered at all.
     
  48. TBoneSan

    TBoneSan Laptop Fiend

    Reputations:
    4,460
    Messages:
    5,558
    Likes Received:
    5,798
    Trophy Points:
    681
    As far as 780m SLI goes, I wouldn't say a 780 Ti smokes it at all - beats it, yes. 880M more so again. Have a look at my benchmark links on the previous page.
    I agree that the dock doesn't need SLI. That need is more of a personal requirement in my lust for overkill.
     
    HTWingNut and Mr. Fox like this.
  49. Nereus333

    Nereus333 Notebook Consultant

    Reputations:
    136
    Messages:
    268
    Likes Received:
    72
    Trophy Points:
    41
    I don't see the point in an external GPU... just get a bigger laptop with an internal GPU instead.

    Also, there's talk of NVidia skipping the 800 desktop series altogether and going straight to the 900 series with a new architecture, releasing in October - the reason being to sync the architecture between the desktop and mobile GPU series. So who knows, perhaps we'll be seeing a 900M series sooner than expected too, with new architecture. Wouldn't that be nice... and surprising, considering AMD isn't exactly forcing NVidia into it...
     
    Mr. Fox likes this.
  50. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist

    Reputations:
    37,213
    Messages:
    39,333
    Likes Received:
    70,624
    Trophy Points:
    931
    Yeah, it's a fancy gimmick a few of the OEMs are experimenting with. I'm with you, though... either make something truly awesome that stands on its own killer performance abilities without having to resort to morphodite contraptions, or just forget about it. Having a powerful desktop GPU in a box sitting on a desk that is not portable won't make up for a featherweight "gaming" system that has overall poor or mediocre performance. They are trying to compensate for a lack of something, and that seldom works well. It should stand on its own as a high-performance laptop, sink or swim. If it can't, then sell it based upon what its own hardware specs tell us that it actually is... a thin and light system that is capable of playing games as long as the graphics settings are conservative. Nothing wrong with that - if you want it, buy it. But let's not pretend it's something it's not by making a FrankenPC out of an Ultrabook.
     