Nice! I like playing with underwear too!
-
Meaker@Sager Company Representative
It's amazing what the 970M has done for the form factor: cooler and faster than the 870M.
-
The core problem with Optimus is that you are limited by the iGPU's performance and Intel's excessive control over the display. If the iGPU is too weak, it bottlenecks your dGPU; if it's too good, like Iris Pro, it just becomes a waste and adds extra TDP draw, because it can't be more powerful than the dGPU anyway.
What I don't understand is why there's no solution for hot-swapping between the iGPU and dGPU; MUX switches require a reboot. -
Back in the day there actually were hot-switching solutions in both the red and green camps, but then came the dawn of automatic switchable graphics, a far more "convenient" solution for the "lazy" computer user.
-
-
Meaker@Sager Company Representative
The processing power of the IGP is not too important; it comes down more to its ability to drive displays, and they are improving.
-
I got 60 FPS in Metro: Last Light running on the P650SE with the GTX 970M, and it stayed not too loud and not too hot.
Optimus is now handled correctly by Nvidia's proprietary graphics drivers.
This Clevo is really a great gaming machine.
And with the IPS Full HD screen being not too expensive, it's quite a bargain. -
I am looking to order from Xotic PC. I was wondering if I should just get the LG IPS screen, or go with the stock screen and replace it if I don't like it. What screens are people replacing the stock screen with, and does anyone have any pictures?
-
-
What is your favorite 970M notebook? The GS60 or the Clevo P650?
-
Meaker@Sager Company Representative
The Clevo/Sager gets a nice cooling system, with more display options and a slot capable of fitting a PCIe SSD.
-
P650. The GS60 is much improved with the new 970M, but it's really hard to upgrade, which kills it for me.
-
Meaker@Sager Company Representative
-
I loved the GS60. An awesome thin, light, and powerful laptop with great build quality. But yeah, the hard-to-access components killed it for me, along with the battery life and the fans being audible all the time. Nitpicking really, but it personally just didn't work for me.
-
I can pick a notebook for my new consulting job. I'll travel a lot, but I want to play games after work at the hotel. I don't know if the Clevo is light enough to travel with that much. I don't need more than 4 h of battery life, and 2.5 cm of thickness is fine. So the Clevo fits perfectly, but 2.5 kg vs. 2 kg for the GS60 may be the edge for back pain, or not.
-
You get back pain from 2-3 kg? Gee, you might consider doing some sports.
-
Haha, I don't know. I do sports, but I've gotten back pain easily since an accident. But I really don't know if 550 g makes a difference. I prefer the Clevo because the service from XMG/Schenker sounds awesome. I don't know how MSI's service is.
-
I guess it depends on how you plan to carry the machine around. Easiest on your back would probably be a backpack, to distribute the weight equally across your whole back and legs...
-
I should add my current laptop is a Clevo W150HR. This is the error I get in nvidia-installer.log:
The models I'm looking at replacing it with are the P650SE or the P157SM-A.
Last edited: Dec 29, 2014 -
[email protected] Notebook Consultant
Is it a legacy card then, supported by a legacy driver? No, that seems too new a device. Is it a 500M series? I heard there were some non-standard OEM parts in that line. -
[email protected] Notebook Consultant
So, in all seriousness, whether you're carrying extra weight on your back, along your sides, or (for a woman) out front too ...
- Body: Take your breathing, posture, etc. seriously. I did at a younger age, whether it was my neck or back (and I never had issues, although I blew out my shoulders, being under 150 lbs. and taking on 250-300 lb. peers), back when I was an athlete, and it still serves me well today, as I'm fat as heck now. I don't have issues, even though I have traveled nearly 100% for 10 of the last 12 years.
- Gear: Get the proper equipment to haul equipment. For a 15.6" executive bag, get a thick, strong, padded strap. For a 15.6" backpack, get a quality one that rides high and distributes weight well. And in the worst case, get a 15.6" roller. Just because it doesn't go in the overhead bin and goes under your seat doesn't mean it cannot be a compact roller.
- Pack: My wife and I did 11 European countries, from central Europe to Ireland, packing everything into a pair of carry-on, under-the-seat backpacks, plus only one 29" upright with all our clothes for a month. Yes, I pack efficiently, and it saved us in various nations on trains, where other tourists (not just Americans, but those from the UK, even France) carrying multiple pieces had at least one stolen.
For us Americans, made-in-USA (San Francisco) Timbuk2 products are great, and they really distribute the weight onto your upper back and shoulders rather than slouching down into your lower back. There are other options too.
That said, we're at the point where new 11.6" products are becoming available with "decent" GPUs. Nothing like the 970M/980M P150 products, but if you're really concerned about another kilo, that's an option. But in reality, you're going to be carrying a lot of other things.
There's also the option to build a very small Mini-ITX system. I have a 7x9x11" SilverStone SG05 with a 450W SFX power supply that can run a GTX 970 (it needs to be 10" long or less) and an i7, although I'd stick with only 2.5" drives in it, as even a single 3.5" is a tight fit (while I've fitted up to 5 x 2.5" before). In the worst case, I can check it as baggage, and as long as I'm not traveling internationally, it's usually fine. It seriously outperforms a mobile GPU too, and most hotels these days have a TV with HDMI. Best of all, the case (w/450W SFX), an H97 or Z97 board, an i7 (or a high-end i5 will do), 16 GiB of DRAM and a GTX 970 will run you well under $1K.
Lastly ... consider forgetting a PC for gaming. The Nvidia Shield Tablet is an option if you don't mind going to a smaller (8") display with only a 192-core GPU. But you'd be surprised how well it runs a lot of 1080p Android titles. And you can stream many Steam games to the tablet from a PC, if one is around. It works very well, as long as you're on local WiFi. The Shield Tablet is also a killer tablet that is $200 cheaper than a Nexus 9, with a better design (especially for today's titles), better speakers, a stylus, and an external controller. But no, it's not a keyboard+mouse gaming system, so that does limit its full potential (had to mention it). -
[email protected] said: ↑It is a 500M series?Click to expand...
[email protected] said: ↑I heard there were some non-standard OEMs in that line.Click to expand...
My current laptop was specced on the basis of needing to run multiple VMs, so the lack of accelerated graphics wasn't too much of an issue. Now that project is over and I've got a lot more free time for gaming, so I don't want to commit to a large expense and end up with the same problem. It does seem the newer Clevos work fine, but I want to be sure. -
I'm thinking about picking up the NP8651 and I already have a copy of Windows 7 Home Premium that I could use. Does anyone know if all of the drivers support Windows 7? And has anyone had any trouble with Windows 7 on the NP8651?
-
[email protected] Notebook Consultant
Th4tRedhe4d said: ↑I'm thinking about picking up the NP8651 and I already have a copy of Windows 7 Home Premium that I could use. Does anyone know if all of the drivers support Windows 7? And has anyone had any trouble with Windows 7 on the NP8651?Click to expand...
The only thing I've ever seen that is really a non-issue -- though it can be an issue of familiarity for the end user installing -- is the firmware being 64-bit uEFI-only, with a very limited Compatibility Support Module (CSM) that prevents legacy 16-bit BIOS boot (e.g., no Int13h Disk Services). But Windows 7 supports a full, 64-bit uEFI installation, and it's all I do today myself (along with GPT disk labels). Multi-booting with other OSes actually becomes much cleaner with uEFI-GPT than with legacy BIOS-MBR. As long as you're not bringing over an existing disk, Windows sets up the EFI System Partition (ESP) and Microsoft Reserved (MSR) partition in the GPT disk label just fine for just about any use.
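For anyone also multi-booting Linux on one of these machines, a quick way to confirm you actually booted via uEFI rather than through the CSM is to check whether the kernel exposes the EFI interface in sysfs. A minimal sketch (the helper name is mine; the /sys/firmware/efi path is the standard one on Linux):

```python
import os

def boot_mode(efi_dir="/sys/firmware/efi"):
    """Return 'uefi' if the running Linux kernel exposes EFI runtime
    services via sysfs, else 'bios' (legacy/CSM boot)."""
    return "uefi" if os.path.isdir(efi_dir) else "bios"
```

If this reports "bios" on a machine you intended to install uEFI-GPT on, the installer was likely started through the CSM and will lay down a legacy MBR boot instead.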
robertc64 said: ↑It's a GTX 550M. Hence my current search for confirmation It does seem the newer Clevos work fine, but I want to be sure.Click to expand...
robertc64 said: ↑My current laptop was specced on the basis of needing to run multiple VMs, so the lack of accelerated graphics wasn't too much of an issue.Click to expand...
I'm hoping that by the second half of next year we start to see 16 GiB 1R SO-UDIMMs (not the current, rare 2R ones, which are limited to one per channel), although I've been reading that Intel's alleged support for larger DRAM ICs in the 8x-series chipsets is not complete. So we'll have to wait on a newer generation to get it, likely with DDR4. I'm in a waiting game now, although I plan on going Clevo and Mythlogic for my next purchase.
robertc64 said: ↑Now that project is over and I've got a lot more free time for gaming so I don't want to commit to a large expense and end up with the same problem. It does seem the newer Clevos work fine, but I want to be sure.Click to expand... -
[email protected] said: ↑I haven't seen a single component or OEM not also release Windows 7 drivers for hardware today. Considering both NT 6.1 (7/2008) and 6.2 (8/2012) use most of the same drivers, because they include virtually the exact same new subsystems, sans GPU which tends to be Windows shell support-specific, I don't think you'll have any issues.Click to expand...
-
Windows 10 is just that: a technical preview -- in other words, pre-beta, more or less. I'm sure Windows 10 drivers will be made available once Windows 10 is released.
-
[email protected] Notebook Consultant
edwardamin13 said: ↑What about Windows 10 technical preview? Will windows 7/8 drivers works there?Click to expand...
Before Windows 10, most subsystems allowed NT6 drivers to work on just about any NT6 release -- other than the GPU and a few other subsystems, plus there are some "new" subsystems that replace legacy ones**, although many applications didn't support them.
**BTW, my personal favorite "new NT6 subsystem" is Windows Image Acquisition (WIA), which replaces TWAIN. TWAIN was a minimal API that required OEMs to fill in the gaps, and two devices often had "vendor conflicts" until a few application vendors finally designed ways to support more than one TWAIN driver with conflicting libraries. Unfortunately, WIA does some things better, but not a lot, and was not well supported until software released in just the last few years. It is still "crippled" versus 15-year-old open source SANE, which is basically the "PostScript" of the scanning world (it has a language, an API, and an extensive subsystem for everything), largely out of "industry alignments." I.e., SANE does network scanning, whereas Microsoft Solution Providers want to sell you a 5-figure (sometimes 6-figure, with CALs) network scanning package. It never ceases to amaze me when I install open source SANE and people can pull up the exact same interface on any desktop (and any OS, for that matter) on the company network to scan a stack of documents -- and I was doing that 15 years ago. WIA is still lightweight in comparison, and TWAIN was always inter-vendor heck if you had more than one scanner and they bundled the same libraries, because Microsoft didn't create a real scanner subsystem.
In defense of Microsoft's often-outsourced developers, this happens in a lot of projects. E.g., when Red Hat finally "unproductized" its 1-year Linux release, no longer charging for anything but the 7+ year (13 years now) backport-sustained versions, they re-initiated the versioning from the then-current Red Hat Linux 10 Beta to Fedora Core 1 Test. That caused major headaches in everything from the build system (which eventually led to Koji) to major changes in the Anaconda installer. I really hate it when "product managers" change things (like channel labels). Fedora should have started at 10, instead of regressing. At least Microsoft went forward on versions, even if it's still almost exactly NT6 underneath. But "product managers" often introduce developer headaches like this, and the string change to "10" from NT "6.x" is more of the same.
edwardamin13 said: ↑I have a license for Win 7 Pro and also considering Win 10 TP. Also, will they release an image for windows 10 consumer preview? Sorry if I'm slightly off topicClick to expand...
Having been there (e.g., "High Touch Beta" and other "Early Adopter" programs -- both OEM and OS vendor), an OEM has to devote resources to working on such a program. In return, the OS vendor dedicates a point of contact to the OEM, and priority routing. It requires major customer involvement, which I don't see happening with Clevo-Sager. They will likely leverage the HP, IBM-Lenovo and Dell work, like other Tier-2s, and even lower Tier-1s, months later.
I.e., your only option for getting Windows 10 "early" will be via MSDN or another relationship. Even the few Tier-1 OEMs involved will likely only work with major customer accounts, so there is a 3-stop circle (OS, OEM, user) with dedicated engineers to solve problems in any "Early Access" program. -
[email protected] said: ↑That's a loaded question.
But that all changes with Windows 10. The "Einsteins" at Microsoft decided to change the version string on NT, so Windows 10 won't be NT 6.4, but NT 10[.0]. This is likely to break all sorts of stuff. In the best case, modifying the INF file might still allow it to load a driver. But in the more common case, an NT6 driver is unlikely to work on NT10 because of at least that string change.
Why would a Tier-2 OEM release an image "preview"? I don't see many doing that except for the "really big" Tier-1 OEMs who have huge swathes of technicians to work with Microsoft. ...Click to expand... -
[email protected] Notebook Consultant
edwardamin13 said: ↑I meant, will Microsoft release a consumer preview after they showed it to the public? But looks like I will just play safe and go with Win7 ProClick to expand...
I.e., your best option is to get to know someone at a local Microsoft Solutions Provider, ideally Gold-certified, as they get all sorts of this info.
As an original NT 3.1 Beta Tester at the largest installed base of the first, native NT application, I used to be on several, internal lists until just a few years ago. Today, I don't know much other than what people feed me indirectly, and my past experiences involved with many programs.
I also keep my MCP credentials current, even though I am usually the "Linux expert" on-site (sometimes with more MCP credentials than most of the Windows staff, beyond my deep internals knowledge). I was even offered, repeatedly and via first-hand personal referrals, the chance to become a "Master" (on-site at Redmond for 3 weeks) on Directory Server, because of my strong NT and AD-LDAP-Kerberos practice at major customers.
But my access finally got yanked around that time too because I was, despite my extensive assistance in AD migration and related LDIF-fu, considered to be working "outside the Windows ecosphere" and could be in a "competitive situation." I.e., I know it came up during the "Master" program consideration, and the last thing "a few people" (and I stress "a few people," as most were professional about it) in Redmond wanted to see was their public "Master" list of Microsoft employees, key Solutions Providers and others "in the ecosystem" along with this single entry of one "(redacted Linux vendor)" employee.
In other words, I screwed myself by becoming too well-known for NT and AD history and internals, especially after I cost Microsoft mid-8 figures a few years back, no less on the desktop-related front, with my extensive embedded Linux and Windows background. Someone made the fatal mistake of saying Linux couldn't be a generic, combinational desktop-PoS kiosk, and 2 days and 1 prototype later (fully skinned and retail-themed in ways impossible to do on Windows), I was into a 12-week project that had about 400 people scrambling to undo the "damage" I caused ... literally, and I mean literally, overnight. -
[email protected] said: ↑That's a loaded question. Before Windows 10, many subsystems allow NT6 drivers to work on just about any NT6 release. ...Click to expand...
[email protected] said: ↑In other words, I screwed myself by becoming too well-known for NT and AD history and internals ...Click to expand...
[email protected] Notebook Consultant
heibk201 said: ↑If that's the case then NT 6.2 wouldn't share drivers with NT 6.1 tho because of string names, but as you said they do share drivers, so I'm guessing they are modding ini files in the first place already aren't they before releasing the drivers?Click to expand...
E.g., 6.10 != 6.1 but 6.10 > 6.1, while 6.1.0 = 6.1 and, most importantly here, 6 = { 6.1, 6.1.0, 6.10 }
I.e., now in the case of NT ...
String with NT major version "6" = NT minor versions { 6.0, 6.1, 6.2 }
String with NT major version "10" > NT minor versions { 6.0, 6.1, 6.2 }
Version major, minor, revision, in a string, is extremely common in coding. Look at any INI/INF file, let alone if you disassemble NT binary/library objects. I deal with this a lot because I modify a lot of [open source] code too.
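To illustrate the point above (a minimal Python sketch of my own, not from any actual driver code): comparing versions as raw strings gives the wrong ordering, while parsing them into integer tuples gives the right one.

```python
def ver_tuple(s, width=3):
    """Parse a dotted version string into a zero-padded tuple of ints."""
    parts = [int(p) for p in s.split(".")]
    return tuple(parts + [0] * (width - len(parts)))

# Lexicographic string comparison gets NT versions wrong:
# "10.0" sorts before "6.2" because '1' < '6' as characters.
assert "10.0" < "6.2"

# Integer-tuple comparison gets them right:
assert ver_tuple("10.0") > ver_tuple("6.2")
assert ver_tuple("6.10") > ver_tuple("6.1")    # 6.10 != 6.1, and 6.10 > 6.1
assert ver_tuple("6.1.0") == ver_tuple("6.1")  # 6.1.0 = 6.1
```

That gap between string comparison and numeric comparison is exactly why a version check written against "6.x" can misfire when the string suddenly becomes "10".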
heibk201 said: ↑and a 12 week project was your only consequence lol?Click to expand...
It's a vicious cycle whereby "open source" is used as a threat to get a better deal, even though it locks the vendor into costs long-term. It's business and finance folk only looking 24, 36 or, best case, 60 months down the road. Microsoft aims for 24 months, settles for 60 months. Engineers like myself look decades ahead, at least 5 years in advance, especially those of us with semiconductor or, better yet, aerospace (25-year planning) backgrounds. But I do see a lot of business people get fired over this stuff, and blackballed as a result. If they are smart, they leave the company within 3 years -- enough time to "save" money on the balance sheet, but not long enough for the "future costs" to hit them, hard.
This is not commonly talked about because these are very, very "sensitive" contracts, but they are extremely common practices, and I've been in so many (all under NDA, of course, so I obviously cannot name parties). Every now and then the "big ones" hit the media, so everyone knows the "gist" of it, while people like me know all of the gory details. The London Stock Exchange (LSE) finally destroyed Microsoft's "last bastion" in FSI. The LSE was Microsoft's biggest "loss leader," with Microsoft funding Accenture (porting from POSIX/Java to Win32/.NET). Over 5 years, Linux 2.2 to 2.4 to 2.6 response time improved 50-fold over NT, and Accenture had major issues porting.
Once the LSE went down for almost the entire day -- the same day Freddie Mac and Fannie Mae were "nationalized" by the US gov't -- that's when they got the boot. Microsoft blamed Accenture, but Accenture was funded by Microsoft (part of the "arrangement"), at a huge loss, so Microsoft wouldn't lose its "last bastion" of the high-speed, mega-money FSI industry. Sometimes saving money really costs you in the end, especially as the LSE was just getting killed by every Linux-based trading house, which had become the overwhelming majority. Even non-RT Linux was beating heavily RT-modified NT (with 2 patents licensed from a Linux-centric developer house, ironically enough).
Microsoft and .NET have now been relegated to Bloomberg- and Reuters-related, non-real-time desktop software in FSI, and they'll never regain that foothold. Oh, you can find plenty of jobs doing that, the majority in fact. But the real money is in real-time trading, and Windows is nowhere to be found.
Retail has been a constant battle for them, even controlling Best Buy's distribution at one point -- e.g., that's when Apple products, sans MS Office for Mac, were removed from the shelf overnight (the iPod's timing being Apple's "bargaining chip" to get Mac products back on the shelf within a couple of years), only to lose them. Micros and other industry relationships have attempted to change that, as Microsoft long assumed Windows would get the back-end as much as the front-end by '00, but never planned on a desktop, let alone a better embedded/kiosk platform, solution presenting itself when they only saw Novell as a server-only competitor.
In this case, it was actually the customer's 2nd time in 7 years looking to go "standards-compliant," and they came very close, as I had Firefox and GNOME completely profiled (mandatory policies -- yes, you can do it, and manage it easily, in Linux with LDAP and/or CM, without costly $300 CAL add-ons for AD) and set up for accessibility-limited employees, and it was very locked down compared to Windows. There were huge advantages to the solution (using $99 PoS systems, priced at 1K quantity) over Windows, especially in replacement and related support costs outside of software and services (easily $500/year that went away). But ultimately it was a decision they made 6 years earlier that gave them the "reasoning" to stay with Windows.
They had built their entire intranet on MS IE 6-only tagging and ActiveX (yes, security issues and all -- which is why so many companies get hacked despite firewalls and layer 7 filtering), which many argued against doing 6 years earlier. This meant their "costs" were heavy in development. It also meant there was an added, "conflict of interest"-type "incentive" for 400 developers to keep Windows as the platform, especially when we provided case studies on reducing that number to 40 for Internet standards-based development. So the arguments weren't technical, even though we showed a major TCO reduction over the next 6 years compared to the prior 6. As Gartner said oh so well in 1999: no proprietary vendor will offer a way out of lock-in; a customer must choose to do it and stick with it.
In reality, I don't consider Microsoft merely proprietary. They purposely break compatibility with their own, undocumented, proprietary formats. That's called purposeful "abandonware." People complain about ODF-based office suites not handling the latest MS Office formats "perfectly," but even MS Office has issues between versions. At the same time, if you use ODF, it works pretty well everywhere, and when compatibility is "broken" in a new version (e.g., 1.0 -> 1.2), it's very well known and documented, with the older version becoming "read-only" and the new version being created with many warnings. MS Office 2013 at least does a better job at this, with very good warnings, but it took 3 versions before Microsoft recognized that the "Transitional" 2007 and 2010 formats had only created new, "abandonware" non-standards that enterprises and, even more so, governments were tired of.
Case in point: MS Office 2007, 2010 and 2013 all have different "Transitional" Office Open XML (OOXML) formats, which are not ISO-standard, aka "Strict," OOXML. It was so bad, with 2010 "breaking" tags from 2007 (let alone 2008 vs. 2011 for Mac, with 2008 missing entire functionality versus Windows), that Microsoft now has no fewer than three (3) "Compatibility Modes" in 2013 -- including for 2007 and 2010, not just 2003 (as in 2007 and 2010). Of course, only 2013 offers a "limited, strict" ISO mode, several tools don't work in it, and it's definitely not the default. Which is why I encourage everyone to set up MS Office to always produce the 2003 formats by default, which are the only formats that MS Office 2007, 2010 and 2013 (let alone 2003 itself) actually "agree upon," at least where the newer tools support 2003.
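You can actually see which flavor a given .docx is from the root namespace of word/document.xml: ECMA/"Transitional" files use the 2006 schemas.openxmlformats.org namespace, while ISO "Strict" files use the purl.oclc.org one. A rough Python sketch (the function name is mine; the namespace URIs are the published ones):

```python
import zipfile
from xml.etree import ElementTree as ET

STRICT_NS = "http://purl.oclc.org/ooxml/wordprocessingml/main"
TRANSITIONAL_NS = "http://schemas.openxmlformats.org/wordprocessingml/2006/main"

def docx_flavor(source):
    """Classify a .docx as ISO 'strict' or ECMA 'transitional' OOXML
    by inspecting the root namespace of word/document.xml."""
    with zipfile.ZipFile(source) as z:  # a .docx is just a zip container
        root = ET.fromstring(z.read("word/document.xml"))
    ns = root.tag[1:].split("}")[0]  # tag looks like '{namespace}document'
    if ns == STRICT_NS:
        return "strict"
    if ns == TRANSITIONAL_NS:
        return "transitional"
    return "unknown"
```

It won't tell you which "Transitional" dialect (2007 vs. 2010 vs. 2013) you have, since those all share the 2006 namespace, which is rather the point.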
This is why, even with ISO standardization (don't get me started -- ISO OpenDocument documents many MS Office behaviors better than Microsoft's own, "written overnight," 1/10th-the-size ISO standard reference), the UK and several other countries are finally starting to mandate the OpenDocument Format (ODF) instead. Boeing has supported it since day 1 (surprise, surprise: the #1 engineering technical documentation company in the world needs a 25-year format), as has Corel-WordPerfect (its filters to/from ODF work great), along with IBM-Lotus and many others over the years, for this reason. Microsoft cannot even make MS Office compatible with MS Office, and Mac compatibility (especially sending documents back to Windows) can be a real PITA.
Which is why I always joke, "MS Office for Linux would have as many compatibility issues as MS Office for Mac, whereas long-standing (over a decade), deeply documented ODF -- standardized first by OASIS ("the" XML standardization organization) and, later, ISO -- does not, and never will." If you care about compatibility beyond 60 months, you have to break from MS IE and Office, which are purposely designed to keep you upgrading and breaking compatibility. Now, that said ... Office 365 does end that, because it's "subscription based" and no longer relies on upgrading, so it will eventually get to ISO OOXML (Office 365 online has compatibility issues with legacy MS Office and the 3 "Transitional" formats, but I won't go there), plus it solves the "Mac user" issue (and will eventually run on Firefox on Linux). Even Google recognizes that Office 365 is becoming a threat to their Docs, so they are moving to add more complete ODF support (along with getting "more serious" about Linux support -- duh, your own Android!).
This separates me from being "just a Linux bigot." People call me one of the most objective "trusted advisors" they've ever met, because I know all of the internals and politics, and how to get them the best deals, including long-term TCO. But in the end, they make their own choices, and I hate being right when it costs them dearly.
In fact, the thing I hate in the IT world is that "job security" phrase. No, you should document yourself out of a job. That's how you grow and get even bigger jobs. If you wonder what is wrong with America as a "producer," instead of the consumer, it's that people think that way -- instead of focusing on efficiency, reliability and ... my personal favorite ... being willing to forgo familiarity to see "what else exists."
Last edited: Dec 31, 2014 -
Not sure if it's my fault or the replacement LCD I bought, but a little horizontal purple streak has developed, only noticeable on dark backgrounds. I knew I shouldn't leave anything sitting on top of it; kicking myself now. I already ordered another LCD from the same eBay seller, nbkit.
-
Splintah said: ↑Not sure if its my fault or the replacement lcd I bought but there's a little horizontal purple streak that developed only noticeable on dark backgrounds. I knew I shouldn't leave anything sitting on top of it kicking myself now. I already ordered another LCD from the same eBay seller nbkitClick to expand...
[email protected] said: ↑Sorry, to clarify, version string. Version strings are evaluated very differently, often being broken down in the major, minor, revision, build, etc... That's so they can use integer operations on then.
E.g., 6.10 != 6.1 but 6.10 > 6.1, while 6.1.0 = 6.1.
I.e., now in the case of NT ...
String with NT major version "6" = NT minor versions { 6.0, 6.1, 6.2 }
String with NT major version "10" > NT minor versions { 6.0, 6.1, 6.2 }
Version major, minor, revision, in a string, is extremely common in coding. Look at any INI/INF file, let alone if you disassemble NT binary/library objects. I deal with this a lot because I modify a lot of [open source] code too.
It's the difference between earning only a quarter-mil in services, and a sale of 7 figures ... while, more importantly, ensuring the customer doesn't have major costs and issues (security is a big one) in the future. In the end, the customer cut their nearly 9 figure expense in half, with Microsoft eating half -- not just losing sales, but having to subsidize entire support staffs, with external parties, at the customer. But they will have the same issues for the next 6 years, like they did the prior 6 years by forcing their entire business to use desktops with Windows.
It's a vicious cycle whereby "open source" is used as a threat to get a better deal, even though it locks the vendor into costs long-term. It's business and finance folk only looking at 24, 36 or, best case, 60 months down-the-road. Microsoft aims for 24 months, settles for 60 months. Engineers like myself look at decades, at least 5 years in advance, especially those of us with semiconductor and, far better yet, aerospace (25 year planning) backgrounds. But I do see a lot of business people get fired over this stuff, and blackballed as a result. If they are smart, they leave the company within 3 years -- enough time to "save" money on the sheet, but not long enough for the "future costs" to hit them, hard.
This is not commonly talked about because these are very, very "sensitive" contracts, but extremely common practices, and I've been in so many (all under NDA of course, so I cannot name parties, obviously). Every now and then, the "big ones" hit the media, so everyone knows the "gist" of it, while people like me know all of the gory details. The London Stock Exchange (LSE) finally destroyed Microsoft's "last bastion" in FSI. The LSE was Microsoft's biggest "loss leader," with Microsoft funding Accenture (porting from POSIX/Java to Win32/.NET). Over 5 years, Linux 2.2 to 2.4 to 2.6 response time improved 50-fold over NT, and Accenture had major issues porting.
Once the LSE went down for almost the entire, same day Freddie Mac and Fannie Mae were "nationalized" by the US gov't, that's when they got the boot. Microsoft blamed Accenture, but Accenture was funded by Microsoft (part of the "arrangement"), at a huge loss, so Microsoft didn't lose their "last bastion" of the high-speed, mega-money FSI industry. Sometimes saving money really costs you in the end, especially as the LSE was just getting killed by every Linux-based trading house, which had become the overwhelming majority. Even non-RT Linux was killing heavily RT-modified NT (with 2 patents licensed from a Linux-centric developer house, ironically enough).
Microsoft and .NET have now been relegated to Bloomberg and Reuters-related, non-real-time desktop software in FSI, and they'll never regain that foothold. Oh, you can find plenty of jobs doing that, the majority. But the real money is in real-time trading, and Windows is nowhere to be found.
Retail has been a constant battle for them, even controlling Best Buy's distribution at one point -- e.g., that's when Apple products, sans MS Office for Mac, were removed from the shelf overnight (the iPod's timing being their "bargaining chip" to get Mac products back on the shelf within a couple of years), only to lose them. Micros and other industry relationships have attempted to change that, as Microsoft long assumed Windows would get the back-end as much as the front-end by '00, but never planned on a desktop, let alone a better embedded/kiosk platform, solution presenting itself when they only saw Novell as a server-only competitor.
In this case, it was actually the 2nd time in 7 years the customer looked to go "standards-compliant," and they came very close, as I had Firefox and GNOME completely profiled (mandatory policies -- yes, you can do it, and manage it easily, in Linux with LDAP and/or CM without costly, $300 CAL add-ons for AD) and set up for accessibility-limited employees, and it was very locked down compared to Windows. There were huge advantages to the solution (using $99 PoS systems, priced at 1K quantity) over Windows, especially in replacement and related support costs outside of software and services (easily $500/year that went away). But ultimately it was a decision they made 6 years earlier that gave them the "reasoning" to stay with Windows.
They had built their entire intranet on MS IE 6-only tagging and ActiveX (yes, security issues and all -- which is why so many companies get hacked despite firewalls and layer 7 filtering) which many argued against doing 6 years earlier. This meant their "costs" were heavy in development. This meant there was the added, "conflict of interest" type "incentive" for 400 developers to keep Windows the platform, especially when we provided case studies on reducing the number to 40 for Internet standards-based development. So the arguments weren't technical, even if we provided the major TCO reduction in the next 6 years, compared to the former. As Gartner said oh-so-well in 1999 ... no proprietary vendor will offer a way out of lock-in, a customer must choose to do it and stick with it.
In reality, I don't consider Microsoft proprietary. They purposely break compatibility with their own, undocumented, proprietary formats. That's called purposeful "abandonware." People complain about ODF-based Office suites not doing the latest MS Office formats "perfectly," but even MS Office has issues between versions. At the same time, if you use ODF, it works pretty well everywhere, and when compatibility is "broken" in a new version (e.g., 1.0 -> 1.2), it's very well known and documented, with the older version being "read-only" and the new version being created with many warnings. MS Office 2013 at least does a better job at this with very good warnings, but you have to understand it took 3 versions before Microsoft recognized the "Transitional" 2007 and 2010 had only created new, "abandonware" non-standards too that enterprises and, even more so, governments were tired of.
Case-in-point: MS Office 2007, 2010 and 2013 all have different "Transitional" Office OpenXML (OOXML) formats, which are not ISO standard aka "Strict" OOXML. It was so bad with 2010 "breaking" tags with 2007 (let alone 2008 v. 2011 for Mac, with 2008 missing entire functionality versus Windows), that Microsoft now has no less than three (3) "Compatibility Modes" in 2013 -- including for 2007 and 2010, not just 2003 (like 2007 and 2010). Of course, only 2013 offers a "limited, strict" ISO mode, and several tools don't work, and definitely not by default. Which is why I encourage everyone to setup their MS Office to always produce 2003 formats by default, which are the only formats that MS Office 2007, 2010 and 2013 (let alone 2003 itself) actually "agree upon," at least when the newer tools support 2003.
This is why, even with ISO standardization (don't get me started -- ISO OpenDocument documents many MS Office standards better than Microsoft's own, "written overnight," 1/10th-the-size ISO standard reference), the UK and several other countries are finally starting to mandate OpenDocument Format (ODF) instead. Boeing has supported it since Day 1 (surprise, surprise -- the #1 engineering technical documentation company in the world needs a 25-year format), Corel-WordPerfect from Day 1 (filters to/from ODF work great), and IBM-Lotus and many others have adopted it over the years for this reason. Microsoft cannot even make MS Office compatible with MS Office, let alone with the Mac -- sending documents back to Windows, especially, can be a real PITA.
Which is why I always joke, "MS Office for Linux would have as many compatibility issues as MS Office for Mac, whereby long-standing (over a decade), deeply documented -- first by OASIS ("the" XML standardization organization) and, later, ISO -- ODF does not, and never will." If you care about compatibility more than 60 months, you have to break from MS IE and Office, which are purposely designed to keep you upgrading and breaking compatibility. Now that said ... Office 365 does end that, because it's "subscription based," and no longer relies on upgrading, so it will eventually get to ISO OOXML (Office 365 on-line has compatibility issues with legacy MS Office and the 3 "Transitional" formats, but I won't go there), plus solves the "Mac user" issue (and will eventually run on Firefox on Linux). Even Google recognizes Office 365 is becoming a threat to their Docs, so they are moving to add more complete ODF support (along with getting "more serious" about Linux support -- duh, your own Android!).
This separates me from being "just a Linux bigot." People call me one of the most objective "trusted advisors" they've ever met, because I know all of the internals and politics, and how to get them the best deals, including long-term TCO. But in the end, they make their own choices, and I hate being right when it costs them dearly.
In fact, the thing I hate in the IT world is that "job security" phrase. No, you should document yourself out of a job. That's how you grow and get even bigger jobs. If you wonder what is wrong with America as a "producer," instead of the consumer, it's that people think that way -- instead of focusing on efficiency, reliability and ... my personal favorite ... a willingness to forgo familiarity to see "what else exists."Click to expand... -
[email protected] Notebook Consultant
Th4tRedhe4d said: ↑when usb 3.0 first started MS limits usb 3.0 boot to natively supported usb 3.0 only.Click to expand...
The lack of an SP2 for Windows 7 is the main problem for installers today. Microsoft expects everyone to be an OEM and build WinPE and other installer environments with the drivers necessary to install a system. This is getting beyond unhelpful for the typical Windows technician. You literally have to know how to do OEM and/or enterprise-type pre-installation and deployments, modifying images, etc., just to do Windows. It's ironic that Linux is now 10x easier to teach in this regard (first-hand, dealing with both), but that's another story (especially on imaging vs. automated install/configuration -- imaging is not faster, and definitely not better, "in general").
So -- back to the end-user argument -- Microsoft really needs to continue to release updated SPs/SRs so people don't have to know those details, just to re-install. They cannot expect end-users to be IT departments with desktop deployment specialists.
As far as everything else ... more off-topic ...
Th4tRedhe4d said: ↑nah I think you are hating on micro$oft a bit too muchClick to expand...
But the NT boot loaders, and general MS flexibility in boot, have been pretty much laughable. They always assume they are the only OS on the system, and even have conflicts with their own solutions at times. People assume otherwise, but when you support a lot of platforms and solutions, you see a lot of major differences and resulting limitations in the NT approaches.
Understand, my first really "bad taste" of Microsoft as an MCP was back in 1999, when I stumbled upon issues with Exchange's SMTP/RFC822 solution. A 3rd-party package was crashing an Exchange server. The MS Solution Provider was blaming my DNS and MX records (how the heck can MX records crash a server?). After sending various malformed RFC822 headers to it, I was able to crash it at will. I reported it to the MS Solution Provider managing the server, who immediately read me the riot act and threatened me. By 2003, the same issues were allowing black hats to compromise and take over Exchange servers, not merely crash them.
Early 2003 was also SQL Slammer, and I was working at a Fortune 20 company. Two subsequent fixes had un-patched the patch that would have prevented SQL Slammer, and it was only because I had the print version of Network Computing -- with the original article that confirmed what I had already told management from various prior release notes (all on-line versions were yanked) -- that I saved a lot of people's jobs. It was the first time I had ever seen MS and their Solution Providers throw their own MCPs "under the bus," and I really was able to "push back" with detailed information -- in private, of course -- that countered their public statements.
It was then that Microsoft finally started taking security seriously. This was also a Gates move, because before then, security teams at Microsoft were basically "shut out" of meetings. The first new "Security Czar" they hired was canned within 6 months, though, not long after he admitted -- publicly -- that no version of Windows had ever been designed for the Internet. He was a bit too honest.
Th4tRedhe4d said: ↑they have definitely more and more opened up these years.Click to expand...
In fact, Microsoft went to Red Hat first, for an interoperability agreement, because MS Solution Providers were continually losing deals because they could not also support Linux. Red Hat shot it down when Microsoft pulled out the "IP license" contract, so Microsoft was forced to Novell-SuSE. Go forward a few years, after that debacle (and, sadly, the end of Novell-SuSE, which are now 2 separate Attachmate companies, with a lot of former SuSE people becoming fellow Red Hat colleagues), and Microsoft was back to asking ... as originally ... Red Hat. Red Hat refused the IP agreement again, and Microsoft finally caved on an "interoperability agreement" without an IP agreement.
The rest is history, especially with some key moves Gates made in Microsoft in 2009, despite his reduced authority. I am very, very critical of Gates and some very poor decisions he's made in his time at Microsoft (don't get me started), but several of his late moves were actually very "eye opening." His decision in 2009 to actually push others in Microsoft to finally stop "actively fighting" open source, under countless guises, was a good one. Of course Microsoft continues to court several of my colleagues, and most of them have turned down job offers (very appealing ones, overall compensation-wise), largely because of the continuing IP "issues."
But every company with a proprietary product line has their IP "issues." Only companies like Canonical and Red Hat, which are GPL-centric and make no money on IP, are free from those. Yes, Red Hat continues to register patents, but purely defensive ones, via their Patent Promise shared with an ever increasing number of companies. The legality of the Patent Promise has been covered many times now, as a company cannot promise to share it freely, perpetually, and then yank the terms.
So understand, my "views" of Microsoft are from the standpoint of a Microsoft Certified Professional (MCP), and not "hating." It's called being a responsible peer. I don't like the 99% of people who just complain about Microsoft endlessly, over money, etc. They just drown out legitimate concerns of peers. My concern is always how difficult Microsoft makes things for its own professionals, sometimes at their own expense.
But yes, eventually I had established myself as a broad expert, so I was brought in as an integrator, and am now a leading consultant in the open source infrastructure space having done most everything from embedded to large scale integrations, plus some custom development here and there. I.e., most people who are "open source" experts are generally "experts" on all sorts of "technologies" and not just "products." E.g., if you are a deep master of Samba services, you actually understand how CIFS/SMB works at the lowest levels. LDAP is extremely useful for AD (where 99% of AD "architects" don't understand it to their detriment), etc...
That's why I am heavily, and I stress heavily, utilized even by Windows departments. I know exactly why something was done, especially back in the '90s, how it still affects things today, and the continued hacks and workarounds.
Th4tRedhe4d said: ↑if I remember correctly back in the XP days bootable usb is a complete jokeClick to expand...
I.e., even the UEFI firmware is far more flexible than NT6.
Th4tRedhe4d said: ↑just the fact that office 365 and rumors of windows 10 being subscription based at least means that they are heading towards more compatible platforms (at least within themselves) much like OS X nowadaysClick to expand...
But yes, Microsoft has had to finally take interoperability seriously. They had to. Part of the reason is legacy x86, which allowed their developers to ignore things like data alignment and various issues. With the Internet generation, everything is streamed bytes and network order now. That has forced them to change their approach.
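The alignment and byte-order point can be illustrated with a small sketch (my own, in Python for brevity, not from the post): packing with an explicit byte-order prefix removes any dependence on the host CPU's native x86 little-endian layout, which is exactly what network protocols force on you.

```python
import struct

value = 0x12345678

# Explicit little-endian (x86-style) vs. explicit network (big-endian) order;
# neither depends on the host CPU's native layout.
little = struct.pack("<I", value)   # 4 bytes, least-significant byte first
network = struct.pack("!I", value)  # 4 bytes, network (big-endian) order

assert little == b"\x78\x56\x34\x12"
assert network == b"\x12\x34\x56\x78"
assert little[::-1] == network  # same bytes, opposite order
```

Code that always packs and unpacks through explicit formats like these is trivially portable; code that memcpy's raw in-memory structs, as much legacy Win32/x86 code did, is not.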
It also helps that .NET is based directly on licensed Java code (originally 1.1 which Microsoft "won" the rights to continue using, even if they could no longer call it "Java," then 1.4+ with the re-license for .NET 2+), which is very POSIX-like and more portable. So the more .NET "infiltrates" Win32/Win64, the more it helps too. That's always been the view from the POSIX (UNIX/Linux) world, more .NET = more portability.
Unfortunately there is just a lot of Win32/x86-only code out there -- i.e., .NET libraries and features with a C/C++ dependency on a library that is only available for Win32/x86. That's why Mono failed, utterly. I think Mono did a great job of exposing how inter-mingled Win32 is with the GUI, such as task management being tied to WinForms, requiring the WINE emulator to emulate portions of the Win32 executive if one built for WinForms (instead of GTK#, which was cross-platform).Last edited: Jan 1, 2015 -
hey guys,
I finally received my laptop today and I am very excited. I got the one without an OS installed because I wanted to install it myself. I just finished installing Windows 8 and now I'm trying to install the drivers from the CD that came with my laptop. Which drivers should I install from the CD? And do I need to install both the Intel graphics driver and the Nvidia graphics driver?
thanks in advance! -
hslayer said: ↑hey guys,
i finally recieved my laptop today and i am very excited. now i got the one without os installed because i wanted to install myself. I just finished installing my windows 8 and now im trying to install the drivers from the cd that came with my laptop. Now i want to ask which drivers should install from the cd? and do i need to install both intel grahics driver and nvidia graphics driver?
thanks in advance!Click to expand... -
HTWingNut said: ↑Install all drivers... and for GPU Intel first, then Nvidia.Click to expand...
thanks -
hslayer said: ↑i am wondering why its required to install both intel and nvidia. is 970m hybrid grahics card? wouldnt there be a conflict between the two drivers if i have both?
thanksClick to expand...
thank you -
hslayer said: ↑also how can i make sure that 970m is on when im playing games? because i tried playing league of legends and my old laptop with 660m will get around 220 fps while my 970m is getting 110 fps... i dont think this is right. can i disable intel graphics? if yes, how can i do it?
thank youClick to expand... -
ChrisAtsin said: ↑You can set that up in the NVIDIA control panel.Click to expand...
-
hslayer said: ↑i did set it to prefer dedicated graphics but why it that im getting 100 less fps than my old 660m?Click to expand...
-
ChrisAtsin said: ↑Maybe because your video settings are higher or your resolution is higher?Click to expand...
-
Are your power settings on high performance and your hotkey settings on performance?
-
ChrisAtsin said: ↑Are your power settings on high performance and your hotkey settings on performance?Click to expand...
-
hslayer said: ↑yes on both. btw im using the latest driver from the nvidia website.Click to expand...
-
hslayer said: ↑i did set it to prefer dedicated graphics but why it that im getting 100 less fps than my old 660m?Click to expand...
Did you get the 4K screen version?
Was your old laptop 720p and not 1080p?
Vertical sync disabled? (I do not know the Hz of the 1080p model, but maybe the screen is 110-120Hz, which would be awesome.)
Shadows enabled or disabled on both?
Your 970M should get around 250-300 FPS (not in ARAM -- ARAM is not optimized and will get substantially lower FPS), just a ballpark guess with vertical sync disabled. -
Meaker@Sager Company Representative
Also check that your power profile is set to high performance, as this lets the CPU stay clocked up; you are likely to be CPU-bound at those frame rates.
-
Hi, everyone. Typical long time lurker, first time poster over here.
I'm going to be staying in the US for a few months, and I'm looking for a laptop to replace my old Asus G53.
I'm just a bit hesitant because I have only had Asus gaming laptops (G51 before G53) and the experience was good enough to stick to it and get the GL551 but I saw some bad reviews about the screen and the soon to be retired 860m.
Soooooo, I came across what seems to be an awesome gaming laptop, but I just want to know if the price difference ($1400 vs $1000) is worth it in terms of build quality and component longevity. The G53 lasted me 3 years before I started searching for replacements, and I hope this one lasts as long; the screen shouldn't be worse than my G53's TN panel.
Also, on a second note, I'm kind of a noob with this M.2 format and wanted to know if it could go alongside an HDD in the Sager. I know it's not the fastest, but I think it's really cheap and, more importantly, it's not an HDD.
Oh, third: is the Sager's TN panel better than the G53's panel, or about the same? I don't think I've ever used a laptop IPS (not Apple) -- unless the Surface Pro's panel is IPS, in which case it's great and I'll definitely be upgrading to it.
Thank you and sorry for the long post.
TL;DR: is this Sager/Clevo $400 better than the GL551?Last edited by a moderator: Jan 17, 2015
Sager NP8651 / Clevo P650SE with GTX 970m First Look
Discussion in 'Sager/Clevo Reviews & Owners' Lounges' started by HTWingNut, Nov 5, 2014.