The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information posted on the forums would be preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.

    Alienware 13 Pre-Release Speculation Thread

    Discussion in '2015+ Alienware 13 / 15 / 17' started by tinker_xp, Aug 8, 2014.

  1. Game7a1

    Game7a1 ?

    Reputations:
    529
    Messages:
    3,159
    Likes Received:
    1,040
    Trophy Points:
    231
    I find it odd that, for some time, the peak power of laptops will regress rather than progress. That just doesn't make sense.
    There has to be a medium where laptops can be powerful, with of course those heavy enthusiast SLI/Crossfire laptops. Ignore those, and the power of a laptop actually goes down for some time rather than going up. It doesn't sit well with me that the best offered is a dual GTX 970M or something of equal caliber in a thin laptop. That's not high-end, that's just a cash-in for an inevitable purchase from the brand. Or loud complaints, whichever comes first.
    I don't understand how intentionally gimped BIOSes and the arrival of the 13 or anything similar means that the 18 could be at the EOL door, especially since Alienware just revived the Area 51. I think they still know there are power users out there, but if the 18 is kicked out, I doubt it would be because of a market trend. They'd lose a chunk of their customers, and it's a lose-lose.
     
    Mr. Fox likes this.
  2. J.Dre

    J.Dre Notebook Nobel Laureate

    Reputations:
    3,700
    Messages:
    8,323
    Likes Received:
    3,820
    Trophy Points:
    431
    They are taking a step backwards (performance-wise) in order to move forward. This next generation of hardware may not double performance, but efficiency should be improved greatly. This will lay the groundwork for future generations of hardware and allow them to really cram in performance. Maxwell is basically a sneak peek into what's to come. Pascal will probably blow Maxwell out of the water. It should be just like the 580M to 680M jump in performance. I think Rob said that 14nm should be able to cram in 130% more cores than 22nm or 28nm? Something like that... I don't know much about that, but it's a huge difference. Don't quote me on that.
     
  3. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist

    Reputations:
    37,213
    Messages:
    39,333
    Likes Received:
    70,624
    Trophy Points:
    931
    I really hope I am wrong. And you're right... that approach doesn't make any sense.

    Well, it might make sense to someone whose idea of "high performance" is a smartphone with a 1.5GHz processor that can stream Netflix and play Angry Birds or Garden Warfare in 1080p. :laugh:

    I have a problem with that "metered" approach, at least in theory. If they can produce the same performance with less power, that's still not a good excuse for producing the same performance. Instead, they should consume exactly the same amount of power and produce the same amount of heat while yielding a massive increase in performance that eclipses anything we have had before. In other words, fully exploit every resource currently available rather than settle on an "efficient" product. They already have the big heat sinks and big power supplies, so to not use them and take advantage of the situation to merely downsize the awesomeness without losing any ground on performance would be criminal. I find it hard to get excited about something that merely promises to redefine the status quo.

    The idea of having something small and energy efficient tomorrow that has performance that matches something big and power hungry today is still a major cop-out in concept. Real progress would be to keep the big footprint and power hungriness in proportions exactly as they are now and produce an insane level of performance, consuming equal amounts of power, that goes far beyond anything imaginable today. Maintaining today's performance with less power draw in a smaller package is just a slick marketing way of selling the status quo in a new wrapper... and I bet they will charge the same price for it, too.

    Efficiency should only be valued (to my way of thinking anyway) when it creates more headroom... headroom to do A LOT more with the same; not remain static or achieve a little more with a little less. It might be an OK concept for a corporate Human Resources Director as a model plan for reduction in workforce, but in the performance-centered computing world, efficiency should never be valued for its ability to achieve the same results with less. If they can make a single MXM GPU that performs like 780M SLI, that's really nice and I'm happy for them. But, my response would be for them (the OEMs) to just shut up and stop cackling about how "efficient" the new one is today, or that just one of the "efficient" GPUs is equal to two of yesterday's beast GPUs. I'd tell them to hush up and give me two of those new GPUs in SLI and then we'd be talking about something worth dropping a wad of cash on... but, only if I can overclock the crap out of it.

    It would be sort of like, "Hey look... this little car, Car A, can go 150 MPH on half the fuel and with 1/3 fewer cubic inches than Car B." But, my response would be, "So what... whoopy-doo, both can only go 150 MPH, so you have not really accomplished anything. Come see me when Car A can go 250 MPH, but don't tell me about how little fuel it takes to match the performance of Car B." You can always fall back on efficiency if you need to take advantage of that in a crisis situation where efficiency matters, but it shouldn't be the end game.
     
    UltraGSM and TBoneSan like this.
  4. TomJGX

    TomJGX I HATE BGA!

    Reputations:
    1,456
    Messages:
    8,707
    Likes Received:
    3,315
    Trophy Points:
    431
    And it wouldn't matter to me and most people with 17"... If Alienware and Clevo go, Asus will still be around.. I loved their G73JH and it had way fewer issues than this Alienware... I regret not waiting and getting the G751 now..
     
  5. J.Dre

    J.Dre Notebook Nobel Laureate

    Reputations:
    3,700
    Messages:
    8,323
    Likes Received:
    3,820
    Trophy Points:
    431
    And there it is! :D

    I can agree with pretty much everything you said, in one way or another, from a consumer's point of view. But put their shoes on for a minute. Think about the number$. They invested just as much (probably more) money in Maxwell to make it more efficient as they did on the 680M to double performance over the 580M. (The reason I reference the 680M is that it was the last major jump in performance.) So, it would take twice as much money to make it both perform twice as fast and be twice as efficient. Why spend twice as much money when they can invest the same amount, make it more efficient, give up to a 40% performance boost, and sell it for years to come? For starters, AMD surely isn't giving them any good reason to move forward and invest in more R&D. It has also been hyped up so much since the 780M was released, people will buy it no matter what.

    The hard truth is: their main goal is to remain profitable. They are not making products with the sole intent to satisfy every whim of every consumer or overclocker in the world. Even I am disappointed with this, but it's completely out of our hands. Plus, why would they invest more when their competition is slacking off? :rolleyes: Competition is always good. I guarantee you they'd have a better card if AMD had better cards. If you think about it, what they did with Maxwell (assuming the rumors have been true) will be impressive, nonetheless. It will still have a decent boost in graphics performance over the previous 780M and 880M. I believe early benchmarks [unofficial leaks] suggested that it will perform at least 40% better. That will pretty much destroy whatever AMD has planned and leave NVIDIA on top with nearly 70% market share.

    My point was basically to suggest that Maxwell is the foundation of generations to come because of how efficient it is. Now that step is out of the way, they can focus more on performance. Now they have no excuse to avoid what we so desperately want! :cool: I'm sure Pascal will be running laps around Maxwell, leaving Kepler in the dust, making the 680M/780M/880M (Kepler brothers) obsolete in that regard.



    In other news, the 980M and 970M have been officially added to the NVIDIA website. :thumbsup: Nvidia GTX 980M brings even greater Maxwell energy savings to gaming laptops.


    This means that Alienware can now stop screwing around and stick a better GPU in the Alienware 13. The Alienware 13, at the very least, should be able to handle an 870M with 6GB of VRAM, just like the Aorus X3. I sure do hope they offer many upgrades on their website for this new system.
     
    reborn2003 and Mr. Fox like this.
  6. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist

    Reputations:
    37,213
    Messages:
    39,333
    Likes Received:
    70,624
    Trophy Points:
    931
    Great post, Brother J.Dre. :thumbsup: I think you've got 'em pegged.

    Now, let's just hope I'm wrong about the 18.
     
    reborn2003 and J.Dre like this.
  7. hypervenum

    hypervenum Notebook Geek

    Reputations:
    0
    Messages:
    89
    Likes Received:
    22
    Trophy Points:
    16
    The only thing missing is that it cost less than the GTX 780!!!
     
  8. J.Dre

    J.Dre Notebook Nobel Laureate

    Reputations:
    3,700
    Messages:
    8,323
    Likes Received:
    3,820
    Trophy Points:
    431
    I'm worried about all this HQ processor crap...

    A lot of manufacturers are using soldered CPUs now, coupled with the new Maxwell GPUs. If Alienware does this with their 17 and 18, I am moving to desktops.
     
    TBoneSan likes this.
  9. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist

    Reputations:
    37,213
    Messages:
    39,333
    Likes Received:
    70,624
    Trophy Points:
    931
    Yeah, I won't put up with soldered CPU or GPU garbage. That's pure trash, and it's one of the reasons that the Ultrabook form factor is garbage. Why does it even exist? Because [insert derogatory here] people are willing to accept it and pay money for it... style over substance; form over function.
     
    TBoneSan likes this.
  10. ssj92

    ssj92 Neutron Star

    Reputations:
    2,446
    Messages:
    4,446
    Likes Received:
    5,690
    Trophy Points:
    581
    Eh, the 900M series seems to be intentionally reduced in performance to keep it further from the desktop models.

    The 980M only has 1536 cores, but I'm sure nVidia could have put in at least 1664 cores like the 970 desktop part.

    This only means the 960M will have 1024 cores or less at most.

    A 1024 core 960M still wouldn't be bad if it makes its way onto the AW13.

    I just feel nVidia could have brought some more performance out of these M cards. Maybe they'll overclock insanely, proving they could have been faster from the start. :thumbsup:
     
  11. J.Dre

    J.Dre Notebook Nobel Laureate

    Reputations:
    3,700
    Messages:
    8,323
    Likes Received:
    3,820
    Trophy Points:
    431
    Yeah but 1536 Maxwell cores perform much better than 1536 Kepler cores. 780M vs 980M is a perfect example of this.

    NVIDIA's focus was efficiency with Maxwell. Pascal will be about performance.
     
  12. Docsteel

    Docsteel Vast Alien Conspiracy

    Reputations:
    776
    Messages:
    2,147
    Likes Received:
    911
    Trophy Points:
    131
    Let's be honest.... 98% of people buying a laptop have absolutely no intention of upgrading their GPU or CPU.... to deride soldered CPUs/GPUs (of which I am personally not a fan) is to ignore what makes sense in the market space, enthusiast be damned. It's easier to maintain from a scale perspective for companies like Dell, and it makes no difference whatsoever to end users in the vast majority of cases.

    To deride the market as "willing to accept it" is to ignore the fundamental reality that users simply don't care; they have no need and, more accurately, no desire to. It's not that they are simply stupid or blissfully unaware: even if you tell them that it's to their long-term advantage to have a non-soldered component, most have no interest and never will... and that's a fact. In truth, powerful, modifiable laptops (and eventually desktops) are probably going to become oddities one day, sad to say... as technology becomes "good enough" it will be simplified and commoditized... not saying I like it, but that is just a fact. It shouldn't be taken as "well, people are just retards for not wanting heavily customizable systems"; it's just not that simple an assessment. Disliking a market trend is one thing, but making gross negative generalizations to justify it avoids facing the real facts of the matter and distorts the question.

    Now, I am with you on the question of form over function, to a point, on the trend towards ultrabook designs, but it's important to bear in mind it's not just a style question; it's truly a portability/ease-of-use question, which IS function too. Laptops should not be thicker than necessary, but also should not be so thin as to considerably impact structure or heat dissipation during a reasonable lifetime. Makers like MSI, Nvidia, etc. are catering to the "turn 'em out, burn 'em out" philosophy, and Dell/AW has bucked this trend, but gone too far the other way, not seeking the proper middle ground either, which is why we are seeing the AW 13 take shape. I don't buy into the line that they will dump it in a year, as this design favors the market more than the AW 14, and I do believe Dell/AW is committed enough to this design footprint that, if the clamor over the dual-core is loud enough, we will see a quad, perhaps as early as the 2nd revision.
     
  13. n=1

    n=1 YEAH SCIENCE!

    Reputations:
    2,544
    Messages:
    4,346
    Likes Received:
    2,600
    Trophy Points:
    231
    They do overclock pretty well, at least the desktop cards. My own Gigabyte 970 easily does +160 core for a whopping 1506 MHz boost without even touching the voltage.
     
    reborn2003 likes this.
  14. ssj92

    ssj92 Neutron Star

    Reputations:
    2,446
    Messages:
    4,446
    Likes Received:
    5,690
    Trophy Points:
    581
    Here's the thing. This generation is definitely supposed to be about power efficiency, but let's take a look:

    780M had 1536 cores with a 100W TDP. GTX 680 had 1536 cores @ 195W TDP.

    980M has 1536 cores @ 100W TDP. GTX 980 has 2048 cores @ 165W TDP & GTX 970 has 1664 cores @ 145W TDP.

    By this logic, the 980M could have easily had 1664 cores, maybe even 1920 cores. What makes this even more logical is that the mobile parts have a 5GHz memory clock at a lower voltage vs. the 7GHz memory clock on desktop models. That also reduces heat/power consumption.

    I don't know, I really feel nVidia did this on purpose. It wouldn't surprise me if we see a 980MX or 985M in a few months, especially if AMD releases their R9 M295X and it happens to match a 980M (it probably won't).
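    A quick way to see the gap being pointed at here is a cores-per-watt calculation using the figures quoted above (a back-of-the-envelope sketch; the core counts and TDPs are the ones stated in the post, not official measurements):

    ```python
    # Back-of-the-envelope cores-per-watt comparison, using the
    # core counts and TDPs quoted in the post above.
    cards = {
        "GTX 780M": (1536, 100),  # (CUDA cores, TDP in watts)
        "GTX 680":  (1536, 195),
        "GTX 980M": (1536, 100),
        "GTX 980":  (2048, 165),
        "GTX 970":  (1664, 145),
    }

    for name, (cores, tdp) in cards.items():
        print(f"{name:8s} {cores} cores / {tdp} W = {cores / tdp:5.2f} cores per watt")
    ```

    By this arithmetic the 980M already sits well above the desktop 980 in cores per watt, which is why a 1664- or even 1920-core mobile part doesn't look implausible within the same 100W envelope.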
     
    reborn2003, Mr. Fox and TBoneSan like this.
  15. Docsteel

    Docsteel Vast Alien Conspiracy

    Reputations:
    776
    Messages:
    2,147
    Likes Received:
    911
    Trophy Points:
    131
    As a (sometimes reluctant) fan of Nvidia over AMD (tried them twice, bad experiences both times), I have to agree that on the surface it looks like they are doing one of their rounds of misdirection, with another product release in the middle. Just looking at your tagline, they kind of played around with the Titan release... while it's truly a good card (I have several as well), the release of the 780 not terribly long after shows they had a plan all along. The Titan was really there to test the bounds of the market and establish a new, higher price point people would accept for ultra performance. Establishing a newer but not substantially better card does the same thing: setting the price/performance point and then allowing another product to enter the mix, only this time at a likely higher cost, or to generate demand with equal performance at a lower cost.
     
    holytoledo951 likes this.
  16. J.Dre

    J.Dre Notebook Nobel Laureate

    Reputations:
    3,700
    Messages:
    8,323
    Likes Received:
    3,820
    Trophy Points:
    431
    That's the most likely scenario... Not a few months, but probably Q2 2015. They did do it on purpose.

    It's all about the money. If it's a public company, you can pretty much predict every move they'll make because they have only one goal.

    Hmm, so the Alienware 13 is today's topic... :D
     
  17. ssj92

    ssj92 Neutron Star

    Reputations:
    2,446
    Messages:
    4,446
    Likes Received:
    5,690
    Trophy Points:
    581
    Speaking of the 13, has there been any new news on it? I can't seem to find anything. Seems like they're going to be quiet about it until release.
     
  18. J.Dre

    J.Dre Notebook Nobel Laureate

    Reputations:
    3,700
    Messages:
    8,323
    Likes Received:
    3,820
    Trophy Points:
    431
    Yeah, been really quiet. All of Alienware has this year. Last year they were throwing stuff in our faces.
     
  19. ssj92

    ssj92 Neutron Star

    Reputations:
    2,446
    Messages:
    4,446
    Likes Received:
    5,690
    Trophy Points:
    581
    They did announce a bunch of products this year though, such as the Alienware 13, the Alpha console, and the triangle-shaped Area 51 desktop.
     
  20. matt-helm

    matt-helm Notebook Consultant

    Reputations:
    28
    Messages:
    168
    Likes Received:
    13
    Trophy Points:
    31
    I wish they would announce something soon... I'm standing here with my money trying to give it to Dell.
     
    reborn2003 likes this.
  21. J.Dre

    J.Dre Notebook Nobel Laureate

    Reputations:
    3,700
    Messages:
    8,323
    Likes Received:
    3,820
    Trophy Points:
    431
    They like to launch on the 10th of the month.
     
  22. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist

    Reputations:
    37,213
    Messages:
    39,333
    Likes Received:
    70,624
    Trophy Points:
    931
    Early adopter... you're a very brave soul. :p

    Seriously, I hope it turns out to your satisfaction.
     
    reborn2003 likes this.
  23. hypervenum

    hypervenum Notebook Geek

    Reputations:
    0
    Messages:
    89
    Likes Received:
    22
    Trophy Points:
    16
    Agree with Mr. Fox, but remember that in any case there is always the 14, and when the 13 comes out the price of the 14 will come down.
     
  24. n=1

    n=1 YEAH SCIENCE!

    Reputations:
    2,544
    Messages:
    4,346
    Likes Received:
    2,600
    Trophy Points:
    231
    Yep my first time being an early adopter and I get burned (Gigabyte 970s with ridiculous coil whine). Never again.
     
  25. Docsteel

    Docsteel Vast Alien Conspiracy

    Reputations:
    776
    Messages:
    2,147
    Likes Received:
    911
    Trophy Points:
    131
    As much as I want to move on one, I am planning to hold off until the revision with Broadwell, and to see if there is a revision to the AW 14. Where they go on those points will determine when and if I pick one up.
     
  26. J.Dre

    J.Dre Notebook Nobel Laureate

    Reputations:
    3,700
    Messages:
    8,323
    Likes Received:
    3,820
    Trophy Points:
    431
    I hope they offer more than that old 860M. I think it would be completely acceptable if they offered the 860M on base models, and an 870M 6GB as an upgrade option. If so, the 1600p screen may not be that bad after all. Otherwise, 1080p is the highest I'd recommend to anyone for this system, especially if they're purchasing for the sole purpose of gaming on a portable platform. A 2GB 860M is such an obsolete GPU at this point for 1600p gaming. That combination is not even worth purchasing unless your budget is tight.
     
    reborn2003 and Mr. Fox like this.
  27. HTWingNut

    HTWingNut Potato

    Reputations:
    21,580
    Messages:
    35,370
    Likes Received:
    9,877
    Trophy Points:
    931
    If they can put an 870m in there, they can put a 970m in there.
     
    reborn2003 and Mr. Fox like this.
  28. J.Dre

    J.Dre Notebook Nobel Laureate

    Reputations:
    3,700
    Messages:
    8,323
    Likes Received:
    3,820
    Trophy Points:
    431
    Does the 970M come soldered? If so, then yeah, you're right.

    But if they keep the AW 14, they probably won't, even if they technically can.
     
    Mr. Fox likes this.
  29. Brynhild

    Brynhild Notebook Enthusiast

    Reputations:
    0
    Messages:
    33
    Likes Received:
    9
    Trophy Points:
    16
    I REALLY hope they keep the 14" and put a 970M in there; that's what I've been waiting for a long time :GEEK:
     
  30. ssj92

    ssj92 Neutron Star

    Reputations:
    2,446
    Messages:
    4,446
    Likes Received:
    5,690
    Trophy Points:
    581
    The 970M will come as a soldered option just like the 870M did, so I can see the 970M in a 14. For the 13 probably just a 960M or the 860M.
     
  31. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist

    Reputations:
    37,213
    Messages:
    39,333
    Likes Received:
    70,624
    Trophy Points:
    931
    Well, notwithstanding the impact on performance, based on personal experience with a couple of other laptops with 1440p and higher resolutions on a screen under 17" I can say that this is miserable to use unless you crank up the display scaling to about 150%. Increasing scaling above 100% defeats one of the purposes of having more than 1080p. At 125-150% scaling, text starts to become out of proportion with other window elements and even causes text truncating in menus and dialogue boxes. At 100-125% scaling the text is too small for comfortable viewing on less than a 17" display even though the higher resolution looks much better. My suggestion to anyone that decides to order the new AW13 (or any other Ultrabook with more than full HD) would be to think long and hard about going with anything higher than a 1080p display resolution, especially if you have never experienced 1440p or higher on a 15" or smaller display... you may regret it. In fact, if you have good eyesight, 1080p is probably ideal for a 13" or 14" display. If you have poor eyesight, don't even think about it.
     
    reborn2003 and n=1 like this.
  32. n=1

    n=1 YEAH SCIENCE!

    Reputations:
    2,544
    Messages:
    4,346
    Likes Received:
    2,600
    Trophy Points:
    231
    Very well said. I don't have the best eyesight, so even 1080p at 17.3" starts to hurt after a while. I don't even want to imagine what 4K at 15.6" or smaller is going to look like.
     
    Mr. Fox likes this.
  33. ssj92

    ssj92 Neutron Star

    Reputations:
    2,446
    Messages:
    4,446
    Likes Received:
    5,690
    Trophy Points:
    581
    I've had a 3200x1800 resolution on a 15.6" display and let me tell you, Windows display scaling (even in the Windows 10 Technical Preview) sucks! Until Microsoft gets their scaling fixed and developers actually implement HiDPI-aware apps, it will always be better to choose the lower resolution (not 1366x768, but something decent like 1080p).

    1080p @ 15.6/17.3/18.4" is fine. I haven't seen it on 13.3/14.1" yet but I can imagine it being okay.

    The 1080p 13.3" panel will probably be the most popular option on the 13.

    I'm hoping the 1080p panel will be of the high-quality IPS variety with good color reproduction. Hopefully we won't have to choose the 2560x1440 option to get the good IPS panel with good colors. I think the 1366x768 panel will be a TN panel.

    The 860M should handle the 1080p fine for most games, but will struggle for sure on more demanding games. Hopefully nVidia will have a 960M or 965M ready by then. They're supposed to be announcing a 960 desktop model this month I believe.
     
  34. J.Dre

    J.Dre Notebook Nobel Laureate

    Reputations:
    3,700
    Messages:
    8,323
    Likes Received:
    3,820
    Trophy Points:
    431
    Are you talking about the size of the icons & text? Alienware likes to increase this to 125% already, even with their current systems. I don't think it would be that much of a problem on a 13" system, considering this can be done. I was referring to gaming and the sacrificed FPS with the higher resolution screen, claiming that the 860M would be a horrible choice.
     
  35. ssj92

    ssj92 Neutron Star

    Reputations:
    2,446
    Messages:
    4,446
    Likes Received:
    5,690
    Trophy Points:
    581
    The problem with that on a HiDPI display, where the screen is small and the resolution is huge, is that the text will be unreadable. It will be so tiny, you'll need a magnifying glass to read it. It isn't a problem with the current Alienware systems, since the display size and resolution are in a sweet spot. On the 13", the 2560x1440 resolution @ 100% scaling would be very small.
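    The pixel-density arithmetic behind this is easy to sketch (the helper functions below are mine, for illustration; the panel sizes and resolutions are the ones discussed in the thread):

    ```python
    import math

    def ppi(width_px, height_px, diagonal_in):
        """Pixels per inch for a panel of the given resolution and diagonal size."""
        return math.hypot(width_px, height_px) / diagonal_in

    def effective_resolution(width_px, height_px, scale):
        """Approximate UI working area after display scaling (scale=1.5 for 150%)."""
        return round(width_px / scale), round(height_px / scale)

    # 2560x1440 on 13.3" is about 221 PPI versus about 166 PPI at 1080p,
    # so unscaled text is roughly 25% smaller on the QHD panel.
    print(f"{ppi(2560, 1440, 13.3):.0f} PPI vs {ppi(1920, 1080, 13.3):.0f} PPI")

    # At 150% scaling, QHD leaves about the working area of a 1707x960 screen.
    print(effective_resolution(2560, 1440, 1.5))
    ```

    So a QHD 13.3" panel at 150% scaling actually gives less UI working space than a native 1080p panel at 100%, which is the trade-off being debated here.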
     
  36. J.Dre

    J.Dre Notebook Nobel Laureate

    Reputations:
    3,700
    Messages:
    8,323
    Likes Received:
    3,820
    Trophy Points:
    431
    What else would you expect? I don't understand your point.

    I already said 1080p is the best option. ;)
     
  37. ssj92

    ssj92 Neutron Star

    Reputations:
    2,446
    Messages:
    4,446
    Likes Received:
    5,690
    Trophy Points:
    581
    I'm pointing out what Mr. Fox said about high resolutions on small displays: how you will be forced to run 150%+ scaling, and Windows doesn't do a good job at it. Lots of apps, even with 150% scaling, will either be too small or will be blurry because they are not HiDPI-aware. You have to have experienced this first-hand to understand how bad it is. It will be uncomfortable to use. See post # 531.
     
    Mr. Fox and n=1 like this.
  38. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist

    Reputations:
    37,213
    Messages:
    39,333
    Likes Received:
    70,624
    Trophy Points:
    931
    Well, historically it has been 125% scaling with 1080p. And, 125% is already tacky enough on 1080p... 100% is much better and gives you a clean, undistorted view of the Windows UI.

    But, that isn't the issue. Scaling at 100% or even 125% is not sufficient for text to be large enough for comfortable reading at 1440p on a screen smaller than 17", and everything starts getting all jacked up really bad above 125% scaling. We're talking icons, icon text, Windows menus, dialogue boxes, text not being able to fit within open windows of third-party applications... everything goes all screwy above 125%, and if you fix it by dropping the scaling, everything becomes too tiny to read on a small screen... been there, done that... no thanks. Yes, 1080p is the best option on a 15" or smaller screen.

    Edit: and it's much worse with Windows 8 than Windows 7 because there are comparatively few UI adjustments in Windows 8. You have to resort to registry hacks and third-party utilities to change some things that Windows 7 (and previous versions) made as easy as pie using native Control Panel tools that Micro$haft conveniently omitted from Windows 8.
     
    n=1 likes this.
  39. n=1

    n=1 YEAH SCIENCE!

    Reputations:
    2,544
    Messages:
    4,346
    Likes Received:
    2,600
    Trophy Points:
    231
    oh god the horror that is 150% scaling *shudders*
     
    Mr. Fox likes this.
  40. J.Dre

    J.Dre Notebook Nobel Laureate

    Reputations:
    3,700
    Messages:
    8,323
    Likes Received:
    3,820
    Trophy Points:
    431
    With a 13" laptop you're probably going to have the screen in your face most of the time. Is it really that much of an issue for you guys? I've never thought it was. The laptop I have experience with is the Macbook Pro 13" Retina. The screen wasn't that bad. I'd hate to game on it, but work was fine.

    EDIT: Is this a Windows specific issue? If so, then I have no first-hand experience with that.
     
  41. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,426
    Messages:
    58,175
    Likes Received:
    17,888
    Trophy Points:
    931
    It comes down to options, at least have a high res and a low res option so people can get what they want.
     
  42. ssj92

    ssj92 Neutron Star

    Reputations:
    2,446
    Messages:
    4,446
    Likes Received:
    5,690
    Trophy Points:
    581
    Yes, Windows only, which is probably why you don't see it as an issue like we do lol.

    Mac OS X actually scales very well, so it isn't a good example of what we're talking about.
     
  43. J.Dre

    J.Dre Notebook Nobel Laureate

    Reputations:
    3,700
    Messages:
    8,323
    Likes Received:
    3,820
    Trophy Points:
    431
    Alienware better get the ball rolling. It has been more than a year.
     
  44. Game7a1

    Game7a1 ?

    Reputations:
    529
    Messages:
    3,159
    Likes Received:
    1,040
    Trophy Points:
    231
    Think I found something in regards to the pre-ordering of the 13. Taken from here.
    Seems as if sometime before 11/3 we'll be able to order the 13, possibly even before PAX AU.
     
    woodzstack likes this.
  45. Brynhild

    Brynhild Notebook Enthusiast

    Reputations:
    0
    Messages:
    33
    Likes Received:
    9
    Trophy Points:
    16
    Great find! I had already accepted that I would have to wait until PAX to hear some news about it; this brings back some hope :p
     
  46. matolati

    matolati Notebook Consultant

    Reputations:
    0
    Messages:
    141
    Likes Received:
    10
    Trophy Points:
    31
    If the 13 is that close, then the 900M will probably be released on the 14 as well on the same date, don't you guys think?

    I guess it makes sense.
     
  47. woodzstack

    woodzstack Alezka Computers , Official Clevo reseller.

    Reputations:
    1,201
    Messages:
    3,495
    Likes Received:
    2,593
    Trophy Points:
    231
    Do you think they deliberately planned the flaws in the AW 18 so that we wouldn't expect as much performance from the other models, and would be content with things like the 17 having 120Hz screens and the 14 being portable and lasting long on its battery? With the line being planned for a slow discontinuation, they screwed up the 18 to the point that people will not want it, or its line, anymore, not just its refresh.
     
    Mr. Fox likes this.
  48. TBoneSan

    TBoneSan Laptop Fiend

    Reputations:
    4,460
    Messages:
    5,558
    Likes Received:
    5,798
    Trophy Points:
    681
    If this is true, I doubt it would have been malicious intent by Dell to kill off the 18, so much as they just didn't give a crap about addressing the issues since a refresh wasn't planned. That, and they don't care about being the best of the best anymore.

    However, I think AW will refresh the AW18 with new 980M's. They've already invested in the facelift, design and architecture, so it's relatively cost-effective for them to send them out for another year sporting 980M's. If they take much longer making this happen, I'd be inclined to agree with Mr. Fox's gut feeling.

    I wouldn't be disappointed if a Dual GPU 17 came to market, so long as it wasn't a scorching hot soldered gimp.
     
    Mr. Fox likes this.
  49. ssj92

    ssj92 Neutron Star

    Reputations:
    2,446
    Messages:
    4,446
    Likes Received:
    5,690
    Trophy Points:
    581
    We'll probably get refreshed 17s/18s with the 980M/970M. Then during 2015, we'll get the R2 with the HM97 chipset and Broadwell processors. They can change the design or get rid of the 18 after that. They wouldn't change things so soon. The Alienware 13 is a good example that the current design is staying for a bit. If anything, all they'll change is the power button to the old-school alien head.
     
  50. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist

    Reputations:
    37,213
    Messages:
    39,333
    Likes Received:
    70,624
    Trophy Points:
    931
    It wouldn't be entirely illogical to think that way in 20/20 hindsight, and the really poopy outcome that may be about to unwind itself in our colons would definitely support the basis for a conspiracy theory.

    But yeah, I don't think they did anything deliberately. In fact, not doing things deliberately is probably the primary reason the 18's performance isn't a whole lot better than what it is. I think things were rushed; Compal probably operated largely on assumptions based on historical engineering data and past experience; and ultimately, there was far too much left to chance. Not enough research (maybe no research) went into the maximum power requirements that 18 overclockers might encounter with a Haswell XM and dual 780M, so we see the artificially low functional limitations that this wounded warrior is hobbled with.

    Had they known better, I think they would have either provided a much higher capacity AC adapter or, to avoid having to go to any effort to do better, they may have just scrapped their plans for building another dual GPU machine before the 18 was ever released. If they do kill off the dual-GPU beast now I think it will be due to engineering laziness, not wanting to invest money to design something truly excellent and an absence of commitment to the customers that want a beast... but not anything malicious. I think they prefer shooting fish in a barrel, and the mickey mouse products are just easy money for them.
     
    Ashtrix and TBoneSan like this.