I find it odd that, for some time, the peak power of laptops will regress rather than progress. That just doesn't make sense.
There has to be a medium where laptops can be powerful, aside of course from those heavy enthusiast SLI/Crossfire laptops. Ignore those, and the power of a laptop actually goes down for some time rather than going up. It doesn't sit well with me that the best on offer is a dual GTX 970M, or something of equal caliber, in a thin laptop. That's not high-end, that's just a cash-in on an inevitable purchase from the brand. Or loud complaints, whichever comes first.
I won't understand how intentionally gimped BIOSes and the arrival of the 13, or anything similar, means that the 18 could be at the EOL door, especially since Alienware just revived the Area 51. I think they still know there are power users out there, but if the 18 is kicked out, I doubt it would be because of a market trend. They'd lose a chunk of their customers, and it's a lose-lose.
-
They are taking a step backwards (performance-wise) in order to move forward. This next generation of hardware may not double performance, but efficiency should be improved greatly. This will lay the groundwork for future generations of hardware and allow them to really cram in performance. Maxwell is basically a sneak peek at what's to come. Pascal will probably blow Maxwell out of the water. It should be just like the 580M to 680M jump in performance. I think Rob said that 14nm should be able to cram in 130% more cores than 22nm or 28nm? Something like that... I don't know much about that, but it's a huge difference. Don't quote me on that.
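For what it's worth, here's the idealized back-of-the-envelope math behind that kind of claim. This is just a sketch using the node sizes mentioned above; it assumes density scales with the inverse square of the feature size, and real processes gain considerably less than this ideal figure:

```python
# Idealized density gain from a process shrink: assume transistor density
# scales with the inverse square of the node's feature size. Real-world
# gains are considerably smaller, so treat these numbers as an upper bound.
def ideal_density_gain(old_nm: float, new_nm: float) -> float:
    return (old_nm / new_nm) ** 2

for old in (28, 22):
    gain = ideal_density_gain(old, 14)
    print(f"{old}nm -> 14nm: ~{gain:.1f}x density (~{(gain - 1) * 100:.0f}% more transistors per mm^2)")
```

That works out to roughly 4x for 28nm-to-14nm and about 2.5x for 22nm-to-14nm on paper, so a "130% more cores" figure is at least plausible once real-world derating is applied.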
-
Well, it might make sense to someone whose idea of "high performance" is a smartphone with a 1.5GHz processor that can stream Netflix and play Angry Birds or Garden Warfare in 1080p. :laugh:
The idea of having something small and energy efficient tomorrow that has performance that matches something big and power hungry today is still a major cop-out in concept. Real progress would be to keep the big footprint and power hungriness in proportions exactly as they are now and produce an insane level of performance, consuming equal amounts of power, that goes far beyond anything imaginable today. Maintaining today's performance with less power draw in a smaller package is just a slick marketing way of selling the status quo in a new wrapper... and I bet they will charge the same price for it, too.
Efficiency should only be valued (to my way of thinking anyway) when it creates more headroom... headroom to do A LOT more with the same; not remain static or achieve a little more with a little less. It might be an OK concept for a corporate Human Resources Director as a model plan for reduction in workforce, but in the performance-centered computing world, efficiency should never be valued for its ability to achieve the same results with less. If they can make a single MXM GPU that performs like 780M SLI, that's really nice and I'm happy for them. But, my response would be for them (the OEMs) to just shut up and stop cackling about how "efficient" the new one is today, or that just one of the "efficient" GPUs is equal to two of yesterday's beast GPUs. I'd tell them to hush up and give me two of those new GPUs in SLI and then we'd be talking about something worth dropping a wad of cash on... but, only if I can overclock the crap out of it.
It would be sort of like, "Hey look... this little car, Car A, can go 150 MPH on half the fuel and 1/3 fewer cubic inches than Car B." But, my response would be, "So what... whoopy-doo, both can only go 150 MPH, so you have not really accomplished anything. Come see me when Car A can go 250 MPH, but don't tell me about how little fuel it takes to match the performance of Car B." You can always fall back on efficiency if you need to take advantage of it in a crisis situation where efficiency matters, but it shouldn't be the end game. -
-
I can agree with pretty much everything you said, in one way or another, from a consumer's point of view. But put their shoes on for a minute. Think about the number$. They invested just as much (probably more) money in making Maxwell more efficient as they did in making the 680M double the performance of the 580M. (The reason I reference the 680M is because it was the last major jump in performance.) So, it would take twice as much money to make it both perform twice as fast and be twice as efficient. Why spend twice as much money when they can invest the same amount, make it more efficient, get up to a 40% performance boost, and sell it for years to come? For starters, AMD surely isn't giving them any good reason to move forward and invest in more R&D. It has also been hyped up so much since the 780M was released that people will buy it no matter what.
The hard truth is: their main goal is to remain profitable. They are not making products with the sole intent of satisfying every whim of every consumer or overclocker in the world. Even I am disappointed with this, but it's completely out of our hands. Plus, why would they invest more when their competition is slacking off? Competition is always good. I guarantee you they'd have a better card if AMD had better cards. If you think about it, what they did with Maxwell (assuming the rumors have been true) will be impressive, nonetheless. It will still have a decent boost in graphics performance over the previous 780M and 880M. I believe early benchmarks [unofficial leaks] suggested that it will perform at least 40% better. That will pretty much destroy whatever AMD has planned and leave NVIDIA on top with nearly 70% of the market.
My point was basically to suggest that Maxwell is the foundation of generations to come because of how efficient it is. Now that step is out of the way, they can focus more on performance. Now they have no excuse to avoid what we so desperately want! I'm sure Pascal will be running laps around Maxwell, leaving Kepler in the dust and making the 680M/780M/880M (Kepler brothers) obsolete in that regard.
In other news, the 980M and 970M have been officially added to the NVIDIA website. :thumbsup: Nvidia GTX 980M brings even greater Maxwell energy savings to gaming laptops.
This means that Alienware can now stop screwing around and stick a better GPU in the Alienware 13. The Alienware 13, at the very least, should be able to handle an 870M with 6GB of VRAM, just like the Aorus X3. I sure do hope they offer many upgrades on their website for this new system.reborn2003 and Mr. Fox like this. -
Great post, Brother J.Dre. :thumbsup: I think you've got 'em pegged.
Now, let's just hope I'm wrong about the 18.reborn2003 and J.Dre like this. -
Just missing a price lower than the GTX 780!!!
-
A lot of manufacturers are using soldered CPUs now, coupled with the new Maxwell GPUs. If Alienware does this with their 17 and 18, I am moving to desktops.TBoneSan likes this. -
Yeah, I won't put up with soldered CPU or GPU garbage. That's pure trash, and it's one of the reasons that the Ultrabook form factor is garbage. Why does it even exist? Because [insert derogatory here] people are willing to accept it and pay money for it... style over substance; form over function.
TBoneSan likes this. -
Eh, the 900M series seems to be intentionally reduced in performance to keep it further from the desktop models.
The 980M only has 1536 cores, but I'm sure nVidia could have put in at least 1664 cores like the 970 desktop part.
This likely means the 960M will have at most 1024 cores.
A 1024 core 960M still wouldn't be bad if it makes its way onto the AW13.
I just feel nVidia could have brought some more performance out of these M cards. Maybe they'll overclock insanely, proving they could have been faster from the start. :thumbsup: -
Yeah but 1536 Maxwell cores perform much better than 1536 Kepler cores. 780M vs 980M is a perfect example of this.
NVIDIA's focus was efficiency with Maxwell. Pascal will be about performance. -
To deride the market as "willing to accept it" is to ignore the fundamental reality that users simply don't care; they have no need and, more accurately, no desire to. It's not that they are simply stupid or blissfully unaware: even if you tell them that it's to their long-term advantage to have a non-soldered component, most have no interest and never will... and that's a fact. In truth, powerful, modifiable laptops (and eventually desktops) are probably going to become oddities one day, sad to say... as technology becomes "good enough" it will be simplified and commoditized. Not saying I like it, but that is just a fact, and it shouldn't be taken as "well, people are just retards for not wanting heavily customizable systems"; it's just not that simple an assessment. Disliking a market trend is one thing, but making gross negative generalizations to justify that dislike ignores the real facts of the matter and distorts the question.
Now, I am with you on the question of form over function, to a point, on the trend towards ultrabook designs, but it's important to bear in mind it's not just a style question; it's truly a portability/ease-of-use question, which IS function too. Laptops should not be thicker than necessary, but also should not be so thin as to considerably impact structure or heat dissipation over a reasonable lifetime. Makers like MSI, Nvidia, etc. are catering to the "turn 'em out, burn 'em out" philosophy, and Dell/AW has bucked this trend, but too far, to the extent that they are not seeking the proper middle ground either, which is why we are seeing the AW 13 take shape. I don't buy into the line that they will dump it in a year, as this design favors the market more than the AW 14 did, but I do believe Dell/AW is committed enough to this design footprint that, if the clamor over the dual-core is loud enough, we will see a quad, perhaps as early as the second revision. -
reborn2003 likes this.
-
780M had 1536 cores with a 100W TDP. GTX 680 had 1536 cores @ 195W TDP.
980M has 1536 cores @ 100W TDP. GTX 980 has 2048 cores @ 165W TDP & GTX 970 has 1664 cores @ 145W TDP.
By this logic, the 980M could have easily had 1664 cores, maybe even 1920 cores. What makes this even more plausible is that the mobile parts have a 5GHz memory clock at a lower voltage vs. the 7GHz memory clock on desktop models. That also reduces heat/power consumption.
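Just to illustrate the headroom argument with the core counts and rated TDPs quoted above, here's some rough arithmetic. It's only a ballpark sketch; it ignores clock speeds, binning, and memory power entirely:

```python
# Cores-per-watt using the figures quoted in this thread (CUDA core counts
# and rated TDPs). A crude metric that ignores clocks and memory power,
# but it shows how much headroom a 100W Maxwell mobile part appears to have.
parts = {
    "GTX 780M": (1536, 100),  # (CUDA cores, rated TDP in watts)
    "GTX 680":  (1536, 195),
    "GTX 980M": (1536, 100),
    "GTX 980":  (2048, 165),
    "GTX 970":  (1664, 145),
}

for name, (cores, tdp) in parts.items():
    print(f"{name}: {cores} cores / {tdp}W = {cores / tdp:.1f} cores per watt")
```

At the 980M's cores-per-watt ratio, even a 1664-core part would only land around 108W on paper, which is why the 1536-core cap looks like a deliberate choice rather than a thermal necessity.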
I don't know, I really feel nVidia did this on purpose. Wouldn't surprise me if we see a 980MX or 985M in a few months, especially if AMD releases their R9 M295X and it happens to match a 980M (probably won't).reborn2003, Mr. Fox and TBoneSan like this. -
holytoledo951 likes this.
-
It's all about the money. If it's a public company, you can pretty much predict every move they'll make because they have only one goal.
-
Speaking of the 13, has there been any new news on it? I can't seem to find anything. Seems like they're going to be quiet about it until release.
-
Yeah, been really quiet. All of Alienware has this year. Last year they were throwing stuff in our faces.
-
They did announce a bunch of products this year though, such as the Alienware 13, the Alpha console, and the triangle-shaped Area 51 desktop.
-
I wish they would announce something soon... I'm standing here with my money trying to give it to Dell.
reborn2003 likes this. -
-
Seriously, I hope it turns out to your satisfaction.reborn2003 likes this. -
I agree with Mr. Fox, but remember that in any case there is always the 14, and when it comes out on the 13, the price of the 14 will come down.
-
-
-
I hope they offer more than that old 860M. I think it would be completely acceptable if they offered the 860M on base models, and an 870M 6GB as an upgrade option. If so, the 1600p screen may not be that bad after all. Otherwise, 1080p is the highest I'd recommend to anyone for this system, especially if they're purchasing for the sole purpose of gaming on a portable platform. A 2GB 860M is such an obsolete GPU at this point for 1600p gaming. That combination is not even worth purchasing unless your budget is tight.
reborn2003 and Mr. Fox like this. -
If they can put an 870m in there, they can put a 970m in there.
reborn2003 and Mr. Fox like this. -
But if they keep the AW 14, they probably won't, even if they technically can.Mr. Fox likes this. -
I REALLY hope they keep the 14" and put a 970M in there; that's what I've been waiting a long time for :GEEK:
-
The 970M will come as a soldered option just like the 870M did, so I can see the 970M in a 14. For the 13 probably just a 960M or the 860M.
-
reborn2003 and n=1 like this.
-
Very well said. I don't have the best eyesight, so even 1080p at 17.3" starts to hurt after a while. I don't even want to imagine what 4K at 15.6" or smaller is going to look like.
Mr. Fox likes this. -
I've had a 3200x1800 resolution on a 15.6" display and let me tell you, Windows display scaling (even in the Windows 10 Technical Preview) sucks! Until Microsoft gets their scaling fixed and developers actually implement HiDPI-aware apps, it will always be better to choose a lower resolution (not 1366x768, but something decent like 1080p).
1080p @ 15.6/17.3/18.4" is fine. I haven't seen it on 13.3/14.1" yet but I can imagine it being okay.
The 1080p 13.3" panel will probably be the most popular option on the 13.
I'm hoping the 1080p panel will be of the high-quality IPS variety with good color reproduction. Hopefully we won't have to choose the 2560x1440 option to get the good IPS panel with good colors. I think the 1366x768 panel will be a TN panel.
The 860M should handle 1080p fine for most games, but will struggle for sure in more demanding ones. Hopefully nVidia will have a 960M or 965M ready by then. They're supposed to be announcing a 960 desktop model this month, I believe. -
-
-
I already said 1080p is the best option. -
-
Well, historically it has been 125% scaling with 1080p. And, 125% is already tacky enough on 1080p... 100% is much better and gives you a clean, undistorted view of the Windows UI.
But, that isn't the issue. Scaling at 100% or even 125% is not sufficient for text to be large enough for comfortable reading at 1440p on a screen smaller than 17", and everything starts getting all jacked up really badly above 125% scaling. We're talking icons, icon text, Windows menus, dialogue boxes, text not fitting within open windows of third-party applications... everything goes all screwy above 125%, and if you fix it by dropping the scaling, everything becomes too tiny to read on a small screen... been there, done that... no thanks. Yes, 1080p is the best option on a 15" or smaller screen.
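For anyone curious, the pixel-density math behind this is easy to check. This is just a quick sketch; it assumes 16:9 panels at the diagonal sizes discussed in this thread and uses Windows' traditional 96 PPI baseline for 100% scaling:

```python
import math

# Pixels per inch for a 16:9 panel, plus the scaling factor needed to keep
# UI elements roughly the same physical size as on a ~96 PPI desktop monitor.
# Dense panels push that figure well past the 125% "sweet spot".
def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    return math.hypot(width_px, height_px) / diagonal_in

for diag in (13.3, 15.6, 17.3):
    for width, height in ((1920, 1080), (2560, 1440)):
        density = ppi(width, height, diag)
        print(f'{width}x{height} @ {diag}": {density:.0f} PPI, ~{density / 96 * 100:.0f}% scaling for desktop-sized UI')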
Edit: and it's much worse with Windows 8 than Windows 7 because there are comparatively few UI adjustments in Windows 8. You have to resort to registry hacks and third-party utilities to change some things that Windows 7 (and previous versions) made as easy as pie using native Control Panel tools that Micro$haft conveniently omitted from Windows 8.n=1 likes this. -
With a 13" laptop you're probably going to have the screen in your face most of the time. Is it really that much of an issue for you guys? I've never thought it was. The laptop I have experience with is the Macbook Pro 13" Retina. The screen wasn't that bad. I'd hate to game on it, but work was fine.
EDIT: Is this a Windows specific issue? If so, then I have no first-hand experience with that. -
Meaker@Sager Company Representative
It comes down to options; at least have a high-res and a low-res option so people can get what they want.
-
Mac OS X actually scales very well, so it isn't a good example of what we're talking about. -
Alienware better get the ball rolling. It has been more than a year.
-
Think I found something in regards to the pre-ordering of the 13. Taken from here.
woodzstack likes this. -
-
If the 13 is that close, then the 900M series will probably be released on the 14 as well around the same date, don't you guys think?
I guess it makes sense -
woodzstack Alezka Computers, Official Clevo reseller.
Mr. Fox likes this. -
If this is true, I doubt it would have been malicious intent by Dell to kill off the 18, so much as they just didn't give a crap about addressing the issues since a refresh wasn't planned. That, and they don't care about being the best of the best anymore.
However, I think AW will refresh the AW18 with new 980Ms. They've already invested in the face lift, design and architecture, so it's relatively cost effective for them to send it out for another year sporting 980Ms. If they take too much longer making this happen, I'd be inclined to agree with Mr. Fox's gut feeling.
I wouldn't be disappointed if a Dual GPU 17 came to market, so long as it wasn't a scorching hot soldered gimp.Mr. Fox likes this. -
We'll probably get refreshed 17s/18s with the 980M/970M. Then during 2015, we'll get the R2 with the HM97 chipset and Broadwell processors. They can change the design or get rid of the 18 after that. They wouldn't change things so soon. The Alienware 13 is a good example that the current design is staying for a bit. If anything, all they'll change is the power button, back to the old-school alien head.
-
But yeah, I don't think they did anything deliberately. In fact, not doing things deliberately is probably the primary reason the 18's performance isn't a whole lot better than what it is. I think things were rushed; Compal probably operated largely on assumptions based on historical engineering data and past experience; and ultimately, there was far too much left to chance. Not enough research (maybe no research) went into the maximum power requirements that 18 overclockers might encounter with a Haswell XM and dual 780M, so we see the artificially low functional limitations that this wounded warrior is hobbled with.
Had they known better, I think they would have either provided a much higher capacity AC adapter or, to avoid having to put in any effort to do better, they may have just scrapped their plans for building another dual GPU machine before the 18 was ever released. If they do kill off the dual-GPU beast now, I think it will be due to engineering laziness, not wanting to invest money in designing something truly excellent, and an absence of commitment to the customers that want a beast... but not anything malicious. I think they prefer shooting fish in a barrel, and the mickey mouse products are just easy money for them.