OK, then I will go with the cheaper one.
-
Is it worth paying 80 extra bucks for the 3K IPS screen?
I ordered the standard TN display, but now I'm asking myself if it's worth ordering that 3K IPS display. -
-
IPS screens have better colour reproduction and good viewing angles, so I'd go with the IPS screen. However, in your case the 3K screen won't let you play games at high settings at its native resolution; in return you get better colours and viewing angles. I recommend you read this review, because the Clevo 3K option uses the same screen.
http://www.notebookcheck.net/Gigabyte-P35X-v3-Notebook-Review.129889.0.html -
-
But what do you mean by a blurry display or blurry downscaling?
I'm OK with the IPS screen's slower image response, but I wouldn't like it if it showed blurry images when downscaling in games, because obviously I'm not going to play at 3K.
I just want a good-quality 1080p IPS display, but if I need to order a 3K one because there is no 1080p option... -
[email protected] Notebook Consultant
When we'd travel together for our alma mater's away games, we used to get flagged by the TSA for extra screenings, especially back in 2002-2005. So I started introducing us to the TSA agents: "Hi, I'm Agent Smith; this is Mr. Anderson." They used to get a kick out of it, because they knew we were likely flagged "SSSS" with those names. And sure enough, we were, almost every time. -
[email protected] Notebook Consultant
But Intel will only make the Windows driver bootable on a Z97. I seriously doubt it's anything technical that prevents it from booting on other chipsets, as I've dealt with a lot of varying storage solutions connected via PCIe over the years. It's just a matter of the boot loader loading the driver along with the kernel, as the Windows kernel does not include support.
Again, keep in mind that PCIe is really just an I/O bus, so a device directly connected via PCIe requires a lot of software in its driver. It's not like SATA, which has a full, pre-existing stack and command set. So unless a vendor provides equivalents to all those functions the system expects for storage, it won't work.
In fact, it wouldn't surprise me if there is only preliminary work on Linux support, or if the early driver leverages an existing but inefficient subsystem, like Serial Bus Protocol 2 (SBP2). I'd have to have one to start debugging it. But again, PCIe is an I/O interconnect. It's used for everything from GPUs to communication hardware. It's not designed specifically for any one thing, so one has to develop an entire storage stack to replace something like AHCI for SATA.
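To make "an entire storage stack" concrete, here is a loose sketch (my own illustration, not any real Windows or Linux driver interface) of the minimal contract an OS expects a bootable storage device's driver to satisfy; with SATA, AHCI supplies all of this for free:

```python
# Hypothetical minimal block-device contract -- illustration only.
from abc import ABC, abstractmethod

class BlockDevice(ABC):
    @abstractmethod
    def identify(self) -> dict:
        """Report capacity, sector size, and model; the OS asks this first."""

    @abstractmethod
    def read(self, lba: int, sectors: int) -> bytes:
        """Read whole sectors by logical block address."""

    @abstractmethod
    def write(self, lba: int, data: bytes) -> None:
        """Write whole sectors, honoring the ordering filesystems expect."""

    @abstractmethod
    def flush(self) -> None:
        """Commit device caches; journaling filesystems depend on this."""
```

A raw PCIe device gets none of these for free; its vendor has to implement (and boot-load) every piece.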
Hence the future lies with NVMe, as well as a native interface on M.2. -
(But maybe internally with the NVIDIA card you could choose to render at 4K and then downscale to 2K, but this would mean a huge cut in framerate) -
-
You can always buy an IPS panel later for under 100 Euro/USD:
AUO B156HAN01.2 for IPS matte
LG LP156WF4SLC1 for IPS glossy (and better contrast). That's the one I bought; thank god the original model came to me, and it arrived without dead pixels.
Make sure to ask the reseller whether he actually has those exact screens in stock, not just an "alternative that will fit". -
[email protected] Notebook Consultant
'I'll leave this to someone more knowledgeable than me. However, I do know if you were to choose between CL9 1600MHz memory vs CL11 1866MHz memory, the 1600MHz memory would be faster' -- ericc191
As ericc191 pointed out ...
- CL9 1600MHz would be faster than ...
- CL11 1866MHz, which is very likely CL10 at 1600MHz
This is often why timing is far more important than clock. Timing is basically the number of cycles for various accesses (there are dozens of numbers, but 3-4 are often quoted), based on the clock. So timing numbers get higher at higher clocks, because the actual latency of the DRAM is essentially constant -- that's the part that never changes, and it's what actually dictates the timing at a given clock. The higher the numbers, the slower the DRAM ICs are in access times. The lower the numbers, the faster the access times. Fast access times in DRAM are critical for reads.
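To put numbers on the CL9-1600 vs CL11-1866 comparison above, here's a minimal sketch (my own arithmetic; a DDR module's I/O clock is half its MT/s data rate):

```python
# Absolute CAS latency in nanoseconds = CL cycles / I/O clock.
def cas_latency_ns(cl, data_rate_mts):
    clock_mhz = data_rate_mts / 2     # DDR transfers twice per clock
    return cl / clock_mhz * 1000      # cycles / MHz -> ns

print(cas_latency_ns(9, 1600))    # ~11.25 ns for CL9-1600
print(cas_latency_ns(11, 1866))   # ~11.79 ns for CL11-1866: slightly slower
```

Same silicon, roughly the same ~11-12 ns access time, regardless of the clock printed on the label.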
Personally, I decided to buy the M.2 device directly with my system from MythLogic: a 512GB Crucial M550 M.2-SATA unit. I'm also getting an extended warranty and LCD replacement (since it's the 4K model). In the future, when more options and standards become available, I might add an M.2-PCIe unit.
ericc191 also mentioned the Crucial MX100 series, which replaced the older m500 line, which in turn replaced the previous M4 line. It's the commodity price-capacity solution, about 25% cheaper than the M550. It's also an option, and it still uses MLC, IIRC.
I.e., if the original OEM didn't design the battery that way, might there have been a reason?
If Li-Ion were an "exact, reproducible science," then everything would be designed to just take X number of rechargeable CR-V3 batteries. But it's not, and it's up to OEMs to design Li-Ion packs specifically for their units. I tend not to trust aftermarket batteries, or I don't buy them until a unit is at least 2-3 years old, when I care less about it. I have this same argument when it comes to car engines.
E.g., there's a reason GM doesn't sell a C6 ZR1 / C7 Z06 with 1,000+ hp and ft-lbs of torque, even though there's a large aftermarket that does, with far more boost in their superchargers and turbos (2x over). GM warranties the engine (and the entire powertrain) for 100,000 miles. Even a few European supercars use the GM LS/LT engines as their base for this reason; there's just so much headroom to push them much harder than GM ever does.
-
Just some more info.
The 1080p screen of the Defiance model from PCSpecialist is the:
N156HGE-LA1
Better or worse than Sager's? -
Bad viewing angles, however: 45/45/20/45° (left, right, up, bottom). -
Just pulled the trigger on the K56-4M from XMX.de.
- CPU+GPU: Intel i7-4710HQ + 3072 MB NVIDIA Geforce GTX 970M
- No logo
- RAM: 8192MB DDR3 1600MHz (1x 8192MB)
- HDD: 500GB 7200rpm (I will add a 256GB M550 M.2 SATA later)
- Wireless LAN Intel Dual Band Wireless-AC 7260 2x2 AC+BT
- No Windows (got my own copy of Windows 8.1)
- 24 Month Pickup & Return Guarantee Support
All for 1,158.87 EUR (+136€ for the SSD); quite a good price, I guess. I'd prefer the 980M, but I can't really wait for it, as I have to travel soon. I hope I can run The Witcher 3 at least on high. -
[email protected] Notebook Consultant
Right now most of the fabs are running with many designs that are at least 3 years old, and it's always about yields. So if a 12ns DRAM IC has higher yields than a 10ns DRAM IC, they will seriously consider making far more of the former.
Why? They are milking as much as they can out of existing designs, especially with the decreased demand over the last several years. DDR3 DRAM has been far more expensive in 2013+ than it was back in 2011-2012, because of the extremely supply-side economics of the semiconductor industry. It wasn't just select events that caused a change in the industry, although they certainly forced the changes to finally occur.
Until DDR4 really hits volume, we're not going to see much change in designs either. That's also why 8Gb IC technology (16GiB 1R UDIMMs) isn't appearing, even though it is supported by DDR3 and Intel's LGA-1150 with its 8x-series chipsets (as well as any new AMD platforms since 2012).
E.g., even the P65x can support 64GiB of RAM, if 16GiB 1R SO-DIMMs became available tomorrow. Right now the only 16GiB SO-DIMMs are 2R, often RDIMMs or, in a few cases, UDIMMs where only 1 can be used per channel.
In fact, when 16GiB 1R DIMMs become available, I fully expect the DDR4 versions to arrive first, and be cheaper, than the DDR3 ones. Switching designs is a multi-billion-dollar risk that impacts fiscals for years to come, so they have to sell in high volume, with supply perfectly predicted, to make any money. -
Can't wait for DDR4. One idea, though: since DDR4 will be quite fast, the ability to allocate system RAM as VRAM for the dGPU, even at half the speed of GDDR5, would still be better than having saturated GDDR5 VRAM.
-
[email protected] Notebook Consultant
E.g., the "theoretical bandwidth" quoted for most Integrated Graphics Processors (IGPs) using a unified memory architecture (UMA) for shading, textures, etc. is estimated differently than for GPUs with VRAM, precisely because there are serious issues doing shading, textures, etc. at the same time as the framebuffer.
Of course, if you're just running a desktop, the framebuffer is mainly all you need. This is still the case with Aero (Windows), Quartz Extreme (Mac OS X), Clutter (Mutter, aka GNOME Shell) or Compiz (various Linux distros) -- all largely just a 3D framebuffer.
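As a rough illustration of the gap (generic example figures, not numbers from this thread): peak bandwidth is just data rate times bus width, and the UMA figure is shared with the CPU on top of that.

```python
# Peak memory bandwidth in GB/s = MT/s x bus width in bytes.
def peak_gb_s(data_rate_mts, bus_width_bits):
    return data_rate_mts * (bus_width_bits // 8) / 1000

uma  = peak_gb_s(1600, 128)   # dual-channel DDR3-1600, shared with the CPU
vram = peak_gb_s(5000, 256)   # e.g., a 256-bit GDDR5 card, GPU-exclusive
print(f"UMA: {uma:.1f} GB/s shared; VRAM: {vram:.1f} GB/s dedicated")
```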
Cadence, a major EDA tool vendor (EDA ~ "CAD for semiconductors") and leading semiconductor IP holder, published an article several years ago, along with their presentation at ARM TechCon, about this issue with DDR4.
- ARM TechCon Paper: Why DRAM Latency is Getting Worse - Industry Insights - Cadence Blogs - Cadence Community
Cadence itself designed one of the very first DDR4 controllers and ICs that many others have licensed. I used their layout tools during my brief semiconductor career ('99-'01).
In fact, one could argue (loosely) that using system memory for a GPU is like using SATA for NAND.
Indeed, in the near future, we'll probably start to see NAND just soldered onto the system board, in addition to the new M.2 interface (plus legacy SATA options).
Taking that one step further, I've more recently stated there's no reason why 16GB (or even 32GB) of inexpensive MLC (or even TLC) couldn't be soldered onto a "performance-marketed" mainboard for less than $5 ($10) in manufacturing cost these days. Under Windows, using Intel's SRT, every platter drive would have a NAND read cache. I.e., it would automagically turn all hard drives into hybrid SSHDs. If it ever fails, no data is lost, because it's all backed by the platter (it's just a NAND-based read cache).
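As a toy illustration of why nothing is lost when such a cache dies (my own sketch, not Intel's actual SRT logic): the platter stays the source of truth, and the NAND only ever holds copies.

```python
from collections import OrderedDict

class ReadCache:
    """LRU read-through cache; 'backing' (the platter) is authoritative."""
    def __init__(self, backing, capacity=4):
        self.backing, self.capacity = backing, capacity
        self.cache = OrderedDict()

    def read(self, block):
        if block in self.cache:
            self.cache.move_to_end(block)     # hit: refresh LRU order
            return self.cache[block]
        data = self.backing[block]            # miss: read from the platter
        self.cache[block] = data
        if len(self.cache) > self.capacity:
            self.cache.popitem(last=False)    # evict least recently used
        return data

platter = {0: b"mbr", 1: b"os", 2: b"data"}
print(ReadCache(platter).read(1))             # b'os' -- a cached copy only
```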
That's little different than those few mainboard designs that include just enough discrete GPU to do a 3D framebuffer, while using UMA for textures and other stores. -
-
OK, interesting. Also, thanks for all of your other answers. I do plan on getting a 2TB 5200RPM mechanical storage drive in the main HDD bay, and after that, what are my options? I can't use a 1TB EVO 2.5" SSD with the 2TB HDD taking up that space, right? I'm a little concerned about the instability of RAID 0 (if I used two M.2s) in light of its meager performance gains over something like a 1TB SATA SSD, but could I even use the 1TB SATA SSD (like an 850 EVO or something) with the 2TB HDD in the same laptop? If not, I'll end up going with two Crucial M.2 SATA M550s @ 512GB each and run them in RAID 0. I'll be getting the laptop without an operating system, so RAID 0 is just something I can set up in the BIOS, right? Because I don't want to have to install Windows on the 2TB drive just to set up the other M.2s in a RAID 0 configuration and then have to reinstall Windows again on the SSDs, haha.
Thanks for the help again. -
[email protected] Notebook Consultant
E.g., if the tops of each drive face each other, inverted, then it might be possible to put a 5mm high NAND 2.5" drive with a 9.5mm high platter 2.5" drive. But if drives stack atop of one another, same horizontal orientation, then the SATA connector edges are 7mm from one another, and this becomes a non-option unless the drives are both 7mm (or less).
Typically in a server, with multi-user access, RAID-0+1 doesn't offer much over RAID-1 (independent reads from each device). But in a single user desktop, the RAID-0 performance can be a major advantage for sequential reads (and writes for that matter), while the +1 (interleaved mirroring on each device) still offers full redundancy (although at the cost of half the storage).
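To put rough numbers on that for a desktop (a sketch; the 500 MB/s and 0.5 TB per-device figures are purely assumptions for illustration):

```python
# Usable capacity and rough sequential throughput per layout.
def layout(name, devices, stripe_ways, mirrored, dev_mb_s=500, dev_tb=0.5):
    usable = devices * dev_tb / (2 if mirrored else 1)
    print(f"{name}: ~{stripe_ways * dev_mb_s} MB/s sequential, "
          f"{usable} TB usable, redundant: {mirrored}")

layout("RAID-1   (2 drives)", 2, 1, True)    # safe, single-device speed
layout("RAID-0   (2 drives)", 2, 2, False)   # fast sequential, no safety net
layout("RAID-0+1 (4 drives)", 4, 2, True)    # fast AND safe, half the capacity
```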
I really hate how Intel and Microsoft don't have more options in their base solutions.
Install \WINDOWS to the platter drive (C: ), and then put all games (e.g., Steam) in \Program Files on the RAID-0 NAND drive (D: ). That way, if you lose one NAND device, it's just a matter of replacing one of them and re-installing Steam.
You can also use a portion of the RAID-0 NAND to regularly backup the C: drive. That way, if your platter fails, it's very fast to restore Windows to a new platter from the RAID-0 NAND.
This would also work if the 2.5" drive with \WINDOWS (C: ) is NAND too. -
-
Sent from my Nexus 5 -
So then I guess a 4K display would be an even better interpolator than your 3K screen.
-
I have other engineer friends who respond similarly to the way you do as well, which is cute; some even loathe him, oddly, lol. Also, yes, I'm positive he wouldn't survive in the real world as an engineer, since he is an actor after all.
Anyway, I was teasing because of the similarities, that's all; no offense was intended <3
-
Meaker@Sager Company Representative
It will be cheaper, yes; the advantage from the shop is the warranty and the lack of hassle.
-
Simple math folks!
The 1080 screen has a total pixel count of 2,073,600 (1920 x 1080).
The 3K screen has a total pixel count of 4,665,600 (2880 x 1620), so the ratio is 2.25:1 compared to 1080.
The 4K screen has a total pixel count of 8,294,400 (3840 x 2160), so the ratio is 4:1 compared to 1080.
Now imagine you are on a 4K screen looking at 1 black pixel which is, say, square. When you reduce your resolution (downscale) to 1080, 4 black square pixels are used to represent that same 1 black square pixel. Ultimately, you end up seeing 1 black square (4 squares arranged to look like one square) after downscaling. Hence no noticeable loss of image quality.
In the case of the 3K screen, the 1 black square pixel needs to be distributed across 2.25 black square pixels when downscaled to 1080 resolution. How can you geometrically arrange 2.25 squares to look like 1 proper square? It's not possible without some pixel adjustment (rendering) to make it look almost like a square. This is where the loss of quality comes from. It may not be visible to everyone, as the panel packs millions of pixels into a 15-inch screen, but a few can notice the difference upon looking closely, in the form of blurriness or loss of sharpness. -
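The same arithmetic in a few lines (a sketch; the per-axis factor is what decides whether the mapping is clean):

```python
# Pixel counts and per-axis scale factors relative to 1080p.
panels = {"1080p": (1920, 1080), "3K": (2880, 1620), "4K": (3840, 2160)}
base_w, base_h = panels["1080p"]

for name, (w, h) in panels.items():
    ratio = (w * h) / (base_w * base_h)
    per_axis = w / base_w   # physical pixels per 1080p pixel, along one axis
    print(f"{name}: {w * h:,} px, {ratio:.2f}x 1080p, {per_axis:.2f}/axis")

# 4K -> 2.00 per axis: each 1080p pixel maps to a clean 2x2 block.
# 3K -> 1.50 per axis: each 1080p pixel covers one and a half physical
# pixels, so the scaler has to blend, which reads as slight blur.
```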
-
[email protected] Notebook Consultant
This is all off-topic ...
One doesn't have to show knowledge or intelligence to crack a great joke. I mean, Blue Collar TV and related redneck stuff is a heck of a lot better.
The problem with the Big Bang Theory is it's all fake geeks at work. They force inaccuracies through jokes as part of the punchline. I wish they just wouldn't go there.
Seinfeld was a great example of how to write a sitcom that didn't fire off constant sexual innuendos or cut-downs, and didn't insult the intelligence of its viewership.
But every time someone has Big Bang Theory on, I just cannot laugh. And I usually have to leave the room if I cannot turn it off. It's honestly that bad. I'd rather watch a reality show, and I really dislike reality shows.
And Sheldon is a total asshat. I don't find him funny at all. I just cannot laugh at him and his faux knowledge. I also don't know anyone like him in the real world, sans maybe some teenagers, maybe a couple of interns, but they grow out of it by the time they reach Sheldon's age. I guess that's why I just absolutely don't like the Sheldon character.
I constantly work in a customer-facing role, and I've never had to deal with a Sheldon. -
[email protected] Notebook Consultant
-
For downscaling to non-native resolutions, at a normal viewing distance you will not notice the difference at all. But if you look through a lens, maybe you can. Try this: type the letter "I" in Notepad at the higher resolution, then switch to a non-native resolution and view it with the lens again; you will see some grey pixels (blurriness) around the periphery of the "I". -
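If you'd rather see those grey pixels without a lens, here's a hypothetical little experiment (assumes the Pillow library; the sizes are scaled-down stand-ins for real panel resolutions):

```python
from PIL import Image

src = Image.new("L", (288, 162), 255)        # white canvas, 3K-like aspect
for y in range(40, 120):
    src.putpixel((144, y), 0)                # 1-px-wide black stroke: our "I"

dst = src.resize((192, 108), Image.LANCZOS)  # non-integer 1.5x downscale
print([dst.getpixel((x, 54)) for x in range(93, 100)])
# Values strictly between 0 (black) and 255 (white) are the grey fringe.
```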
-
I just went back to look at the site, and I see no mention of the AC adapter at all, at least not in the configurator. I'm pretty sure it was there when I ordered. Weird. I know other config items have changed, like the 7260 got upgraded to 7265, but still... that's something they should have told me about.
Edit: Gotta love the small print: Specifications are subject to change without notice
Edit 2: The Sager site still says 180W, so I'm hopeful.
Edit 3: I've sent an email to LPC Digital asking for clarification; I'll let you know what they tell me. -
What confuses me is that all the orders coming from resellers are fulfilled by Sager; it is Sager who assembles the laptop and ships it out from one location, CA. So why different specs from different resellers?
Or is it that the resellers' websites are not reflecting the correct specs, and one needs to refer to Sager's specs on their own website to know what to expect? -
-
Assuming all Sager units are shipped as bare shells to the retailers, where they install the RAM, HDD, OS, and any other special services (monitor calibrations and thermal paste upgrades) that the customer orders.
-
Weird; I figured, with so many HDD configurations and promotions, that that was up to the shop.
-
Sent from my Nexus 5 using Tapatalk -
-
Also, I think what was being described was upscaling, not downscaling - when 1 pixel at 1920x1080 gets scaled up to 4 pixels at 3840x2160. -
-
Edit: I confirm their website is updated as well. Great job Dabeer! :thumbsup: -
However, I always assumed they built them from the base Clevos, like they had them at their shop ready to be customized. If they all just come from Sager to us (without choosing any major customizations), is the warranty then the only differentiating factor? Could a reseller please address this so we know how it works?