The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information posted on the forums is preserved. For current discussions, many NBR forum users moved to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.

    *** Official Clevo P65xSA/SE/SG / Sager NP8650/51/52 Owner´s Lounge ***

    Discussion in 'Sager/Clevo Reviews & Owners' Lounges' started by jaybee83, Oct 13, 2014.

  1. pico78

    pico78 Notebook Enthusiast

    Reputations:
    0
    Messages:
    35
    Likes Received:
    6
    Trophy Points:
    16
    ok, then i will go with the cheaper one :)
     
  2. tiner

    tiner Notebook Evangelist

    Reputations:
    101
    Messages:
    435
    Likes Received:
    16
    Trophy Points:
    31
    Is it worth paying 80 extra bucks for the 3K IPS screen?

    I ordered the standard TN display but now I'm asking myself if it's worth ordering that 3K IPS display.
     
  3. wickette

    wickette Notebook Deity

    Reputations:
    241
    Messages:
    1,006
    Likes Received:
    495
    Trophy Points:
    101
    3K means: a beautiful, vibrant display, but poor performance in games. If you only play undemanding games, then choose the 3K, but take into consideration that downscaling back to 1080p will give you a blurry image.
     
  4. bigspin

    bigspin My Kind Of Place

    Reputations:
    632
    Messages:
    3,952
    Likes Received:
    566
    Trophy Points:
    181
    IPS screens have better colour reproduction and good viewing angles, so I'd go with the IPS screen. However, in your case the 3K screen won't allow you to play games at high settings; in return you get better colours and viewing angles. I recommend you read this review, because the Clevo 3K comes with the same screen.

    http://www.notebookcheck.net/Gigabyte-P35X-v3-Notebook-Review.129889.0.html
     
  5. Dabeer

    Dabeer Notebook Evangelist

    Reputations:
    357
    Messages:
    633
    Likes Received:
    204
    Trophy Points:
    56
  6. tiner

    tiner Notebook Evangelist

    Reputations:
    101
    Messages:
    435
    Likes Received:
    16
    Trophy Points:
    31
    But what do you mean by a blurry display or blurry downscaling?

    I'm OK with the IPS screen's slow image response, but I wouldn't like it if it shows blurry images when downscaling in games, because obviously I'm not going to play at 3K.

    I just want a good-quality 1080p IPS display, but if I need to order a 3K one because there is no 1080p option...
     
  7. b.j.smith@ieee.org

    [email protected] Notebook Consultant

    Reputations:
    303
    Messages:
    279
    Likes Received:
    175
    Trophy Points:
    56
    As with most traditional engineers, I very much dislike that show. I'm not into sitcoms much at all, but that show really insults my intelligence.

    You're not the first on a forum to suggest such. But Sheldon's character is an idiot, and in real life, he would not survive. Of course, when anyone on a forum meets me IRL, they immediately realize I'm nothing like Sheldon.

    My best friend's name is Anderson.

    When we'd travel together for our Alma Mater's away games, we used to get flagged by the TSA for extra screenings, especially back in 2002-2005. So I started introducing ourselves to the TSA agents, "Hi, I'm Agent Smith, this is Mr. Anderson." They used to get a kick out of it, because they knew we were likely flagged "SSSS" with those names. And sure enough, we used to be almost every time.
     
    Last edited by a moderator: May 12, 2015
    Oranjoose and ericc191 like this.
  8. b.j.smith@ieee.org

    [email protected] Notebook Consultant

    Reputations:
    303
    Messages:
    279
    Likes Received:
    175
    Trophy Points:
    56
    From what I read, Intel's driver for Windows works on any chipset, and allows any system to use the storage. I mean, in the end, it's just a PCIe interface, so it works as long as there is a driver with all the required functions.

    But Intel will only make the Windows driver bootable on a Z97. I seriously doubt it's anything technical that prevents it from booting on other chipsets, as I've dealt with a lot of varying storage solutions connected via PCIe over the years. It's just a matter of the boot loader loading the driver along with the kernel, as the Windows kernel does not include support.

    Again, keep in mind that PCIe is really just an I/O bus, so a device directly connected via PCIe requires a lot of software in its driver. It's not like SATA, which has a full, pre-existing stack and command set. So unless a vendor provides equivalents to all those functions the system expects for storage, it won't work.

    In fact, it wouldn't surprise me if there is only preliminary work on Linux support, or if the early driver leverages an existing but inefficient subsystem, like Serial Bus Protocol 2 (SBP2). I'd have to have one to start debugging it. But again, PCIe is an I/O interconnect. It's used for everything from GPUs to communication hardware. It's not designed specifically for any one thing, so one has to develop an entire storage stack to replace something like AHCI for SATA.

    Hence the future with NVMe, as well as a native interface on M.2.
     
    heibk201 likes this.
  9. pico78

    pico78 Notebook Enthusiast

    Reputations:
    0
    Messages:
    35
    Likes Received:
    6
    Trophy Points:
    16
    Blurry because downscaling will have a factor of 1.5 (3K -> 2K aka 1080p) instead of 2 (4K -> 2K aka 1080p). That means if you scale down from 4K to 2K, you just don't show every second pixel... this is not possible with 3K to 2K...
    (But maybe internally with the Nvidia card you choose to render at 4K and then downscale to 2K, but this means a huge cut in framerate.)
     
    tiner likes this.
  11. tiner

    tiner Notebook Evangelist

    Reputations:
    101
    Messages:
    435
    Likes Received:
    16
    Trophy Points:
    31
    So, better to buy a 1080p native panel and stay with that TN model for the moment.
     
  12. wickette

    wickette Notebook Deity

    Reputations:
    241
    Messages:
    1,006
    Likes Received:
    495
    Trophy Points:
    101
    You can always buy an IPS panel later for under 100 Euro/$:
    AUO B156HAN01.2 for IPS matte
    LG LP156WF4SLC1 for IPS glossy (and better contrast) (that's the one I bought; thank god the original model came to me, and it came without dead pixels)

    Make sure to ask the reseller whether he actually has those exact screens in stock, not just an "alternative that will fit".
     
  13. b.j.smith@ieee.org

    [email protected] Notebook Consultant

    Reputations:
    303
    Messages:
    279
    Likes Received:
    175
    Trophy Points:
    56
    ericc191 really nailed it with this ...

    'I'll leave this to someone more knowledgeable than me. However, I do know if you were to choose between CL9 1600MHz memory vs CL11 1866MHz memory, the 1600MHz memory would be faster' -- ericc191

    As ericc191 pointed out ...
    - CL9 1600MHz would be faster than ...
    - CL11 1866MHz, which is very likely CL10 at 1600MHz

    This is why timing is often far more important than clock. Timing is basically the number of cycles for various accesses (there are dozens of numbers, but 3-4 are commonly quoted), relative to the clock. So timing numbers get higher at higher clocks, because the actual latency of the DRAM is essentially constant -- that's the part that never changes, and it's what actually determines the timing at a given clock. The higher the numbers, the slower the DRAM ICs are in access times; the lower the numbers, the faster. Fast access times in DRAM are critical for reads.
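The CL9-1600 vs CL11-1866 comparison can be sanity-checked with a little arithmetic. A quick sketch (`cas_latency_ns` is just an illustrative helper, not from the post), assuming DDR3, where the I/O clock is half the MT/s rating:

```python
# Effective CAS latency in nanoseconds: cycles-to-access times cycle duration.
def cas_latency_ns(cl: int, mt_per_s: int) -> float:
    clock_mhz = mt_per_s / 2      # DDR transfers twice per clock
    cycle_ns = 1000 / clock_mhz   # duration of one clock cycle in ns
    return cl * cycle_ns

print(cas_latency_ns(9, 1600))   # 11.25 ns
print(cas_latency_ns(11, 1866))  # ~11.79 ns, so CL9-1600 has the lower true latency
```

This is the "actual latency is constant" point in numbers: the two modules' access times are within half a nanosecond of each other, and the lower-clocked, tighter-timed part comes out slightly ahead.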

    Yes, you can absolutely do such.

    Personally I decided to buy the M.2 device directly with my system from MythLogic, a 512GB Crucial m550 M.2-SATA unit. I'm also getting an extended warranty and LCD replacement (since it's the 4K model). In the future, when more options and standards become available, I might go with additional M.2-PCIe unit.

    ericc191 also mentioned the Crucial MX100 series, which replaces the older m500 line, which in turn replaced the previous M4 line. It's the commodity price-capacity solution, about 25% cheaper than the m550. It too is an option that still uses MLC, IIRC.

    It's very unlikely there would be an increase of aggregate stored energy for the same technology in the same volume. But furthermore... if there were an aftermarket battery that offered such, would you trust it?

    I.e., if the original OEM didn't design the battery that way, might there have been a reason?

    If Li-Ion were an "exact, reproducible science," then everything would be designed to just take X number of rechargeable CR-V3 batteries. But it's not, and it's up to OEMs to design Li-Ion packs specifically for their units. I tend not to trust aftermarket batteries, or don't buy them until a unit is at least 2-3 years old, when I care less about the unit. I have this same argument when it comes to car engines.

    E.g., there's a reason GM doesn't sell a C6 ZR1 / C7 Z06 with 1,000+ hp and ft-lbs of torque, even though there's a large aftermarket that does, with far more boost in their superchargers and turbos (2x over). GM warranties the engine (and the entire powertrain) for 100,000 miles. Even a few European supercars use the GM LS/LT engines as their base for this reason; there's just so much headroom to push them much harder than GM ever does.

    People pay a lot of money to tune Windows in the professional world. If you are able to do such yourself, more power to you.
     
    Oranjoose, ericc191 and Sandwhale like this.
  14. bigspin

    bigspin My Kind Of Place

    Reputations:
    632
    Messages:
    3,952
    Likes Received:
    566
    Trophy Points:
    181
    I bought the old HX321LS11IBK2/16 and it has slightly better timings (11-11-11). The model you listed is the newer version; I don't know the reason behind the change.
     
  15. tiner

    tiner Notebook Evangelist

    Reputations:
    101
    Messages:
    435
    Likes Received:
    16
    Trophy Points:
    31
    Just some more info.

    The 1080p screen of the Defiance model from PCSpecialist is the:
    N156HGE-LA1

    Better or worse than Sager's?
     
  16. wickette

    wickette Notebook Deity

    Reputations:
    241
    Messages:
    1,006
    Likes Received:
    495
    Trophy Points:
    101
    Based on Panelook: a minimum contrast of 600:1 and ~300 cd/m² luminosity. That's a good screen, BUT I'm not fond of CHI MEI.
    Viewing angles are BAD, however: 45/45/20/45° (left, right, up, down).
     
  17. ldkv

    ldkv Notebook Consultant

    Reputations:
    26
    Messages:
    148
    Likes Received:
    11
    Trophy Points:
    31
    Just pulled the trigger on the K56-4M from XMX.de.

    - CPU+GPU: Intel i7-4710HQ + 3072 MB NVIDIA GeForce GTX 970M
    - One Logo
    - RAM: 8192MB DDR3 1600MHz (1x 8192MB)
    - HDD: 500GB 7200rpm (I will add a 256GB M550 M.2 SATA later)
    - Wireless LAN: Intel Dual Band Wireless-AC 7260 2x2 AC+BT
    - No Windows (got my own copy of Windows 8.1)
    - 24 Month Pickup & Return Guarantee Support

    All for 1,158.87 EUR (+136€ for the SSD), quite a good price I guess :). I'd prefer the 980M but can't really wait for it as I have to travel soon. Hope I can run The Witcher 3 at least on high :D
     
  18. b.j.smith@ieee.org

    [email protected] Notebook Consultant

    Reputations:
    303
    Messages:
    279
    Likes Received:
    175
    Trophy Points:
    56
    IC cost, availability, etc...

    Right now most of the fabs are running many designs that are at least 3 years old, and it's always about yields. So if a 12ns DRAM IC has higher yields than a 10ns DRAM IC, they will seriously consider making far more of the former.

    Why? They are milking as much as they can out of existing designs, especially with the decreased demand over the last several years. DDR3 DRAM has been far more expensive in 2013+ than it was back in 2011-2012, because of the extremely supply-side-based economics of the semiconductor industry. It wasn't just select events that caused a change in the industry, although they certainly forced it to finally occur.

    Until DDR4 really hits volume, we're not going to see much change in designs either. That's also why 8Gb IC technology (16GiB 1R UDIMMs) isn't appearing, even though it is supported in DDR3 and Intel's LGA-1150 and 80+ series chipsets (as well as any new AMD platforms since 2012).

    E.g., even the P65x could support 64GiB of RAM, if 16GiB 1R SO-DIMMs became available tomorrow. Right now the only 16GiB SO-DIMMs are 2R, often RDIMMs or, in a few cases, UDIMMs where only 1 can be used per channel.

    In fact, when 16GiB 1R DIMMs become available, I fully expect DDR4 versions to arrive first, and cheaper, than DDR3. Switching designs is a multi-billion dollar risk that impacts fiscals for years to come, so they have to sell in high volume with supply perfectly predicted to make any money.
     
  19. wickette

    wickette Notebook Deity

    Reputations:
    241
    Messages:
    1,006
    Likes Received:
    495
    Trophy Points:
    101
    Can't wait for DDR4. One idea would be that, since DDR4 will be quite fast, you could allocate some of it as VRAM for the dGPU; even at half the speed of GDDR5, that's still better than a saturated GDDR5 VRAM.
     
  20. b.j.smith@ieee.org

    [email protected] Notebook Consultant

    Reputations:
    303
    Messages:
    279
    Likes Received:
    175
    Trophy Points:
    56
    First off, VRAM is typically multi-ported, has asymmetric access options, among other things. So it's not so much about being a "faster" memory bus or having more "width" (e.g., interleaved 64-bit "dual channel" versus 128-384 bit wide); its overall design is much better from the standpoint of GPU usage.

    E.g., the "theoretical bandwidth" quoted for most Integrated Graphics Processors (IGPs) using a unified memory architecture (UMA) for shading, textures, etc. is estimated differently than for those with VRAM, precisely because there are serious issues doing shading, textures, etc. at the same time as the framebuffer.

    Of course, if you're just running a desktop, the framebuffer is mainly all you need. This is still the case with Aero (Windows), Quartz Extreme (Mac OS X), Clutter (Mutter aka GNOME Shell) or Compiz (various Linux), all largely just a 3D framebuffer.

    Secondly, we're going to hit a point of "diminishing returns" in the future with faster and faster synchronous clocks for DRAM.

    Cadence, a major EDA tool (EDA ~ "CAD for semiconductor") vendor and leading semiconductor IP holder, published an article, along with their presentation at ARM TechCon, several years ago about this issue with DDR4.
    - ARM TechCon Paper: Why DRAM Latency is Getting Worse - Industry Insights - Cadence Blogs - Cadence Community

    Cadence itself designed one of the very first DDR4 controllers and ICs that many others have licensed. I used their layout tools during my brief semiconductor career ('99-'01).

    In fact, one could argue (loosely) that using system memory for a GPU is like using SATA for NAND.

    In fact, in the near future, we'll probably start to see NAND just soldered on the system board, in addition to the new M.2 interface (plus legacy SATA options).

    Taking that one step further, I've more recently stated there's no reason why 16GB (or even 32GB) of inexpensive MLC (or even TLC) could not be soldered on a "performance-marketed" mainboard for less than $5 ($10) in manufacturing cost these days. Under Windows, using Intel's SRT, every platter would have a NAND read cache. I.e., automagically turn all hard drives into hybrid SSHDs. If it ever fails, no data is lost, because it's all backed by the platter (it's just a NAND-based read cache).

    That's little different from those few mainboard designs that include just enough discrete GPU to do a 3D framebuffer, while using UMA for textures and other stores.
     
    Oranjoose likes this.
  21. Sandwhale

    Sandwhale Notebook Consultant

    Reputations:
    69
    Messages:
    123
    Likes Received:
    83
    Trophy Points:
    41
    When you say same screen you mean a 3K IPS screen, right? Not the exact same model?
     
  22. Sandwhale

    Sandwhale Notebook Consultant

    Reputations:
    69
    Messages:
    123
    Likes Received:
    83
    Trophy Points:
    41

    Ok interesting. Also, thanks for all of your other answers. I do plan on getting a 2TB 5200RPM mechanical drive in the main HDD bay, and after that, what are my options? I can't use a 1TB 2.5" EVO SSD with the 2TB HDD taking up that space, right?

    I'm a little concerned with the instability of RAID 0 (if I used two M.2s) in light of its meager performance boost over something like a 1TB SATA SSD, but could I even use a 1TB SATA SSD (like an 850 EVO or something) with the 2TB HDD in the same laptop? If not, I'll end up going with 2 Crucial M.2 SATA M550s @ 512GB each and run them in RAID 0.

    I'll be getting the laptop without an operating system, so RAID 0 is just something I can set up in the BIOS, right? Cause I don't want to have to install Windows on the 2TB drive just to set up the other M.2s in a RAID 0 configuration and then have to reinstall Windows again on the SSDs haha.

    Thx for help again.
     
    ericc191 likes this.
  23. b.j.smith@ieee.org

    [email protected] Notebook Consultant

    Reputations:
    303
    Messages:
    279
    Likes Received:
    175
    Trophy Points:
    56
    I'm curious how the SATA connector edges are located in the bay.

    E.g., if the tops of each drive face each other, inverted, then it might be possible to pair a 5mm-high NAND 2.5" drive with a 9.5mm-high platter 2.5" drive. But if the drives stack atop one another in the same horizontal orientation, then the SATA connector edges are 7mm from one another, and this becomes a non-option unless the drives are both 7mm (or less).

    I'm still shocked more (at least non-Linux) systems don't do 2-drive RAID-0+1, especially with NAND devices that don't have seek issues.

    Typically in a server, with multi-user access, RAID-0+1 doesn't offer much over RAID-1 (independent reads from each device). But in a single user desktop, the RAID-0 performance can be a major advantage for sequential reads (and writes for that matter), while the +1 (interleaved mirroring on each device) still offers full redundancy (although at the cost of half the storage).

    I really hate how Intel and Microsoft don't have more options in their base solutions.

    You could do this ...

    Install \WINDOWS to the platter drive (C: ), and then put all games (e.g., Steam) in \Program Files on the RAID-0 NAND drive (D: ). That way, if you lose one NAND device, it's just a matter of replacing one of them and re-installing Steam.

    You can also use a portion of the RAID-0 NAND to regularly backup the C: drive. That way, if your platter fails, it's very fast to restore Windows to a new platter from the RAID-0 NAND.

    This would also work if the 2.5" drive with \WINDOWS (C: ) is NAND too.
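The striping-versus-mirroring trade-off above can be pictured with a toy layout. This is a conceptual sketch only (the `raid0`/`raid1` helpers are illustrative, not how any real controller lays out data):

```python
# Toy block layout for two devices.
def raid0(blocks):
    # RAID-0: alternate blocks between the two devices (striping),
    # so sequential reads/writes hit both devices at once.
    return blocks[0::2], blocks[1::2]

def raid1(blocks):
    # RAID-1: identical copy on each device (mirroring),
    # full redundancy at the cost of half the capacity.
    return list(blocks), list(blocks)

blocks = ["b0", "b1", "b2", "b3"]
print(raid0(blocks))  # (['b0', 'b2'], ['b1', 'b3'])
print(raid1(blocks))  # (['b0', 'b1', 'b2', 'b3'], ['b0', 'b1', 'b2', 'b3'])
```

A "0+1" scheme as described in the post combines both ideas on the same pair of devices: stripe for sequential speed, with each stripe also mirrored, which is why you keep redundancy but lose half the storage.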
     
  24. Addy246

    Addy246 Notebook Consultant

    Reputations:
    0
    Messages:
    109
    Likes Received:
    28
    Trophy Points:
    41
    Yeah, and besides, it would save you a couple bucks if you get the parts yourself. That is why I am planning to get the basic configuration: 8 gigs RAM, platter HD, etc. Later I will buy the SSDs and extra RAM for a lower price. Example: going from 8 gigs to 16 gigs of RAM costs $90 while configuring, but 8 gig modules out there are like 70 bucks a piece.
     
    ericc191 and Sandwhale like this.
  25. Cakefish

    Cakefish ¯\_(?)_/¯

    Reputations:
    1,643
    Messages:
    3,205
    Likes Received:
    1,469
    Trophy Points:
    231
    Actually 1080p looks sharp on a 3K screen. Can confirm this because I own one. It may be a different panel to the one Clevo uses though. It's extremely good at interpolating with no noticeable blur in games.

    Sent from my Nexus 5
     
    ericc191 likes this.
  26. jaybee83

    jaybee83 Biotech-Doc

    Reputations:
    4,125
    Messages:
    11,571
    Likes Received:
    9,149
    Trophy Points:
    931
    so then i guess a 4K display would be an even better interpolator than your 3K screen ;)
     
  27. LunaP

    LunaP Dame Ningen

    Reputations:
    946
    Messages:
    999
    Likes Received:
    1,102
    Trophy Points:
    156
    Most people are aware that what's said on the show isn't factual for the most part, as the intent is comedic, NOT to educate anyone. However, the one KEY thing I notice that engineers in the real world share with those on the show is the diminishing sense of humor once you're at a state of knowledge above others lol. Comedy shouldn't ever have to insult anyone; for that very reason, if you're insulted, you may be taking it too seriously.

    I have other engineer friends that respond similarly to the way you do, which is cute; some even loathe him, oddly lol. Also yes, I'm positive he wouldn't survive in the real world as an engineer, since he is an actor after all ;)

    Anyways, I was teasing because of the similarities, that's all, no offense was intended <3 :D

    Mr 4K here is trying to spread the gospel xD
     
    adampk17 and Sandwhale like this.
  28. marcos669

    marcos669 Notebook Evangelist

    Reputations:
    359
    Messages:
    300
    Likes Received:
    71
    Trophy Points:
    41
    I suppose that you don't have a decent camera to take a couple of photos? Because, you know, different people could have different perceptions of it.
     
  29. wickette

    wickette Notebook Deity

    Reputations:
    241
    Messages:
    1,006
    Likes Received:
    495
    Trophy Points:
    101
    Tried it too, and didn't find it sharp enough ;). I don't know how tolerant you can be regarding sharpness, but for me downscaling is a no-no. When in motion, the downscaling creates a weird "fuzzy" effect on the screen that is annoying.
     
    ericc191 likes this.
  30. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,431
    Messages:
    58,189
    Likes Received:
    17,898
    Trophy Points:
    931
    It will be cheaper yes, the advantage from the shop is the warranty and lack of hassle.
     
    ericc191 likes this.
  31. Addy246

    Addy246 Notebook Consultant

    Reputations:
    0
    Messages:
    109
    Likes Received:
    28
    Trophy Points:
    41
    Simple math folks!

    A 1080p screen has a total pixel count of 2,073,600 (1920 x 1080).
    A 3K screen has a total pixel count of 4,665,600 (2880 x 1620), so the ratio is 2.25:1 compared to 1080p.
    A 4K screen has a total pixel count of 8,294,400 (3840 x 2160), so the ratio is 4:1 compared to 1080p.

    Now imagine you are on a 4K screen looking at 1 black pixel, which is, say, square-shaped. When you reduce your resolution (downscale) to 1080p, then 4 black square pixels are used to represent that same 1 black square pixel. Ultimately, you end up seeing 1 black square (4 squares arranged to look like one square) after downscaling. Hence no noticeable loss of image quality.

    In the case of a 3K screen, the 1 black square pixel needs to be distributed across 2.25 pixels when downscaled to 1080p. How can you geometrically arrange 2.25 squares to look like 1 proper square? It's not possible without some pixel adjustment (rendering) to make it look almost like a square. This is where the loss of quality comes from. It may not be visible to everyone, as the panel packs millions of pixels into a 15-inch screen, but a few can notice the difference upon looking closely, in the form of blurriness or loss of sharpness.
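The pixel-count ratios above check out with simple arithmetic (a quick sketch; the variable names are just for illustration):

```python
# Total pixel counts for the three panel resolutions discussed.
def pixels(w, h):
    return w * h

fhd = pixels(1920, 1080)      # 1080p: 2,073,600
three_k = pixels(2880, 1620)  # "3K":  4,665,600
four_k = pixels(3840, 2160)   # 4K:    8,294,400

print(three_k / fhd)  # 2.25 -> fractional: 1080p content needs interpolation
print(four_k / fhd)   # 4.0  -> integer: each 1080p pixel maps to a clean 2x2 block
```

The integer 4:1 ratio is exactly why 4K-to-1080p scaling stays crisp while the fractional 2.25:1 ratio on a 3K panel cannot map pixels one-to-one.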
     
    jaybee83 and ericc191 like this.
  32. LunaP

    LunaP Dame Ningen

    Reputations:
    946
    Messages:
    999
    Likes Received:
    1,102
    Trophy Points:
    156
    What about interpolation, etc., since some screens are crappy at it? As for Cake, I think he was referring to 2011x1158 or w/e the resolution just above 1080p was. That, and how close to the screen do you really need to be to see it?
     
  33. b.j.smith@ieee.org

    [email protected] Notebook Consultant

    Reputations:
    303
    Messages:
    279
    Likes Received:
    175
    Trophy Points:
    56
    This is all off-topic ...

    Obviously you haven't worked with me. I laugh at genuine wit.

    One doesn't have to show knowledge or intelligence to crack a great joke. I mean, Blue Collar TV and related redneck stuff is a heck of a lot better.

    The problem with The Big Bang Theory is it's all fake geeks at work. They force inaccuracies through jokes as part of the punchline. I wish they just wouldn't go there.

    I honestly don't find anything on that program funny. Even though I don't like sitcoms, any time they don't just use sexual innuendos and cutdowns constantly to make a punchline (even the short-lived Dilbert sitcom made that mistake too often), I'm impressed.

    Seinfeld was a great example of how to write a sitcom that didn't fire off constant sexual innuendos and cutdowns, or insult the intelligence of its viewership.

    But every time someone has The Big Bang Theory on, I just cannot laugh. And I usually have to leave the room if I cannot turn it off. It's honestly that bad. I'd rather watch a reality show, and I really dislike reality shows. ;)

    Traditional engineer? Or technology professional? That's the difference.

    I didn't take offense to your teasing. I only take offense to how bad The Big Bang Theory is. ;)

    And Sheldon is a total asshat. I don't find him funny at all. I just cannot laugh at him and his faux knowledge. I also don't know anyone like him in the real world, sans maybe some teenagers, maybe a couple of interns, but they grow out of it by the time they reach Sheldon's age. I guess that's why I just absolutely don't like the Sheldon character.

    I constantly work in a customer-facing role, and I've never had to deal with a Sheldon.
     
  34. b.j.smith@ieee.org

    [email protected] Notebook Consultant

    Reputations:
    303
    Messages:
    279
    Likes Received:
    175
    Trophy Points:
    56
    It's virtually impossible to take a good photo showing pixelation, because the photo itself is pixelated. Simple optics at work.
     
  35. Addy246

    Addy246 Notebook Consultant

    Reputations:
    0
    Messages:
    109
    Likes Received:
    28
    Trophy Points:
    41
    Interpolation (upscaling) will always result in loss of quality, and things do look crappy. Considering the same example of squares: you have 1 square at 1080p resolution, and upscaling would have to fill 4 squares at 4K resolution. But the pixel information (color black) is stored in only one square, so how does this get distributed across 4 squares when upscaled? What info would be filled in the 3 extra squares? Again, some rendering is required here, but the quality is not very promising compared to downscaling. This is why images look all pixelated when upscaled. But this also depends on how the application is designed for upscaled resolutions. I have heard (not seen) that a few MS applications do well when upscaled to 4K. Basically, the image or game or app or whatever you want to upscale should be designed for the higher 4K resolution. Then you will not notice any loss when viewing at either 1080p or 4K.

    For downscaling to non-native resolutions, at normal distance, you will not notice the difference at all. But if you look through a lens, maybe you can notice. Try this: type the letter "I" in Notepad at the higher resolution, then downscale to a non-native resolution and view with the lens again; you will see some grey pixels (blurriness) around the periphery of the "I".
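The integer-versus-fractional scaling point can be seen in a one-dimensional toy: with a whole-number factor every source pixel is repeated uniformly, while a fractional factor repeats some pixels more than others, which reads as unevenness or blur. A minimal nearest-neighbour sketch (illustrative only; real panel scalers use more sophisticated filters):

```python
# Nearest-neighbour upscaling of a 1-D row of pixel values.
def upscale_nn(row, factor):
    out_len = int(len(row) * factor)
    # each output position maps back to the nearest source pixel
    return [row[int(i / factor)] for i in range(out_len)]

row = [0, 1, 2, 3]
print(upscale_nn(row, 2))    # [0, 0, 1, 1, 2, 2, 3, 3] -> uniform duplication
print(upscale_nn(row, 1.5))  # [0, 0, 1, 2, 2, 3] -> uneven duplication
```

Note how at factor 1.5 pixel `0` appears twice but pixel `1` only once; smoothing filters hide that unevenness by blending neighbours, which is exactly the blur described above.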
     
  36. Addy246

    Addy246 Notebook Consultant

    Reputations:
    0
    Messages:
    109
    Likes Received:
    28
    Trophy Points:
    41
    Dabeer - you ordered your P650SE from LPC Digital, right? Was the PSU in the spec rated 180W at the time of order? I now see 150W being offered for this laptop. :(
     
  37. Dabeer

    Dabeer Notebook Evangelist

    Reputations:
    357
    Messages:
    633
    Likes Received:
    204
    Trophy Points:
    56
    Definitely LPC Digital... and it was 100% 180W when I ordered... but now that you mention it, I'm looking at the email with the order, and it doesn't include the AC adapter as a line item. I really hope it arrives with 180W, otherwise that's going to be extremely uncool.

    I just went back to look at the site, and I see no mention of the AC adapter at all, at least not in the configurator. I'm pretty sure it was there when I ordered. Weird. I know other config items have changed, like the 7260 got upgraded to 7265, but still... that's something they should have told me about.

    Edit: Gotta love the small print: Specifications are subject to change without notice

    Edit 2: The Sager site still says 180W, so I'm hopeful.

    Edit 3: I've sent an email to LPC Digital asking for clarification, I'll let you know what they tell me.
     
    ericc191 likes this.
  38. Addy246

    Addy246 Notebook Consultant

    Reputations:
    0
    Messages:
    109
    Likes Received:
    28
    Trophy Points:
    41
    Thanks for the update. On LPC's site, I remember seeing a choice for a spare 180W PSU on the configuration page, but there is no such option offered now. All I see is the power rating of the PSU on the main spec page, which reads 150W. Xotic and PowerNotebooks, however, are offering a 180W spare PSU on their configuration pages, plus they also show you are getting a 180W standard PSU. 180W is also mentioned on their spec pages.

    What confuses me is that all the orders coming from resellers are fulfilled by Sager; it is Sager who assembles the laptop and ships it out from one location - CA. Then why different specs from different resellers?

    Or is it that the resellers' websites are not reflecting the correct specs, and one needs to refer to Sager's spec on their website to know what to expect?
     
  39. Addy246

    Addy246 Notebook Consultant

    Reputations:
    0
    Messages:
    109
    Likes Received:
    28
    Trophy Points:
    41
    Cool, thanks!
     
  40. tfast500

    tfast500 Notebook Consultant

    Reputations:
    1
    Messages:
    179
    Likes Received:
    71
    Trophy Points:
    41
    I'm assuming all Sager machines are shipped to the retailers, where they install the RAM, HDD, OS and any other special services the customer orders, like monitor calibration and thermal paste upgrades.
     
  41. Dabeer

    Dabeer Notebook Evangelist

    Reputations:
    357
    Messages:
    633
    Likes Received:
    204
    Trophy Points:
    56
    I'm confused by this as well. The fact that many resellers were listing the AC-7260 while Sager and others were listing the AC-7265 makes me think that many resellers were posting a spec sheet that had been provided a lot earlier, and were not getting the latest specs from Sager. I want to think that LPC had a very early spec sheet, and copied a part that they knew got updated (the WLAN card) but missed the update to the AC adapter. If and when they respond to my email, maybe they'll update the site with the full latest spec sheet.
     
  42. Dabeer

    Dabeer Notebook Evangelist

    Reputations:
    357
    Messages:
    633
    Likes Received:
    204
    Trophy Points:
    56
    I think the RAM, HD, and OS are actually installed by Sager at "the factory", and it's only shipped to the reseller for things like monitor calibration and thermal paste upgrades, otherwise it's shipped directly to the customer. But of course a reseller could confirm this for us...
     
  43. Addy246

    Addy246 Notebook Consultant

    Reputations:
    0
    Messages:
    109
    Likes Received:
    28
    Trophy Points:
    41
    If you go through the FAQ section on Xotic's website and look at their 5th question re: the different phases of an order, it mentions that if the laptop is a Sager, it is Sager who assembles it from the ground up. Only in the case of customization (paint job, laser etching, etc.) does the finished laptop get shipped to Xotic's location first. LPC has no customization to offer, so the finished laptop would come directly from Sager's location, with an exception for CA residents (whose laptops would go to LPC's office in NV first and then ship back to CA to avoid taxes, at LPC's own additional shipping expense).
     
  44. tfast500

    tfast500 Notebook Consultant

    Reputations:
    1
    Messages:
    179
    Likes Received:
    71
    Trophy Points:
    41
    Weird. Since there seem to be so many HDD configurations and promotions, I figured that was up to the shop.
     
  45. Splintah

    Splintah Notebook Deity

    Reputations:
    278
    Messages:
    1,948
    Likes Received:
    595
    Trophy Points:
    131
    From what I understand, screens do not check whether or not the native resolution is evenly divisible by the lower resolution, and will render it in the same blurry manner regardless

    Sent from my Nexus 5 using Tapatalk
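    For what it's worth, the "divisible" check being described is easy to express: a lower resolution can only map to whole pixel blocks when the native resolution is an integer multiple of it in both dimensions. A quick sketch (my own illustration, not anything a scaler actually runs):

    ```python
    # Can a lower resolution map to whole pixel blocks on a given native panel?
    def integer_scale(native, target):
        """Return the integer scale factor, or None if scaling must interpolate."""
        nw, nh = native
        tw, th = target
        if nw % tw == 0 and nh % th == 0 and nw // tw == nh // th:
            return nw // tw
        return None

    print(integer_scale((3840, 2160), (1920, 1080)))  # 2    -> clean 2x2 blocks
    print(integer_scale((2880, 1620), (1920, 1080)))  # None -> 1.5x, interpolated
    ```

    So 1080p on a 3K (2880x1620) panel lands on a 1.5x ratio, which forces interpolation either way.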
     
  46. Addy246

    Addy246 Notebook Consultant

    Reputations:
    0
    Messages:
    109
    Likes Received:
    28
    Trophy Points:
    41
    Well, if you go look at Sager's own website and try to configure, they are offering the same HDD options (and same rates/promotions) compared to reseller's config page. Just the base price is slightly higher on Sager's page.
     
  47. Dabeer

    Dabeer Notebook Evangelist

    Reputations:
    357
    Messages:
    633
    Likes Received:
    204
    Trophy Points:
    56
    In the simplistic case of a single black pixel, it's likely that the result will be 4 black pixels. In general, though, it is true that the image will be interpolated and the results will be slightly more blurry than the original.

    Also, I think what was being described was upscaling, not downscaling - when 1 pixel at 1920x1080 gets scaled up to 4 pixels at 3840x2160.
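    To make that concrete, here's a toy Python sketch (mine, not from the thread): at an exact 2x ratio, nearest-neighbour scaling turns one source pixel into a clean 2x2 block, while an interpolating (bilinear-style) scaler produces in-between grey values around it, which is the blur being described.

    ```python
    # Toy 2x upscalers for a 2-D grayscale image (lists of lists, 0-255).

    def nearest_2x(img):
        """Upscale by 2x using nearest-neighbour: each pixel becomes a 2x2 block."""
        return [[img[y // 2][x // 2]
                 for x in range(2 * len(img[0]))]
                for y in range(2 * len(img))]

    def bilinear_2x(img):
        """Naive bilinear 2x upscale (clamped at the borders)."""
        h, w = len(img), len(img[0])
        out = []
        for y in range(2 * h):
            sy = min(y / 2.0, h - 1)
            y0 = int(sy)
            y1 = min(y0 + 1, h - 1)
            fy = sy - y0
            row = []
            for x in range(2 * w):
                sx = min(x / 2.0, w - 1)
                x0 = int(sx)
                x1 = min(x0 + 1, w - 1)
                fx = sx - x0
                top = img[y0][x0] * (1 - fx) + img[y0][x1] * fx
                bot = img[y1][x0] * (1 - fx) + img[y1][x1] * fx
                row.append(top * (1 - fy) + bot * fy)
            out.append(row)
        return out

    # One black pixel (0) in a white (255) 3x3 field.
    src = [[255, 255, 255],
           [255,   0, 255],
           [255, 255, 255]]

    nn = nearest_2x(src)
    bl = bilinear_2x(src)

    black_nn = sum(v == 0 for row in nn for v in row)       # exactly 4 black pixels
    greys_bl = sum(0 < v < 255 for row in bl for v in row)  # grey fringe = blur
    print(black_nn, greys_bl > 0)
    ```

    Nearest-neighbour gives exactly 4 black pixels with no blur; the bilinear version surrounds them with grey, even at a perfect 2x ratio, if the scaler doesn't special-case integer factors.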
     
  48. Dabeer

    Dabeer Notebook Evangelist

    Reputations:
    357
    Messages:
    633
    Likes Received:
    204
    Trophy Points:
    56
    Great news - LPC Digital has confirmed that the AC adapter is 180W, and that the website was merely out of date, but has now been updated with the correct specification.
     
    tfast500 and Addy246 like this.
  49. Addy246

    Addy246 Notebook Consultant

    Reputations:
    0
    Messages:
    109
    Likes Received:
    28
    Trophy Points:
    41
    Thanks for letting us know! I was planning on ordering from LPC, so it is a relief to know it is indeed 180W! :)

    Edit: I confirm their website is updated as well. Great job Dabeer! :thumbsup:
     
    Dabeer likes this.
  50. ericc191

    ericc191 Notebook Evangelist

    Reputations:
    174
    Messages:
    305
    Likes Received:
    39
    Trophy Points:
    41
    Yeah this would have been a big deal for me too. I plan to make my purchase with LPC-Digital again since they did such a good job on my last notebook.

    However, I always assumed they built them from the base Clevos - like they had them at their shop ready to be customized. If they all just come from Sager to us (barring any major customizations), then is the warranty the only differentiating factor? Could a reseller please address this so we know how it works?
     