The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information posted on the forums is preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.

M4800 Owner's Thread

Discussion in 'Dell Latitude, Vostro, and Precision' started by changt34x, Oct 29, 2013.

  1. smvb64

    smvb64 Notebook Enthusiast

    Reputations:
    0
    Messages:
    14
    Likes Received:
    2
    Trophy Points:
    6
    Oh wow, that manual is an invaluable resource, thank you.
    You just saved me around $120, was getting ready to bring it in.
    I'll let you guys know the end result.

    Cheers,
    Dan
     
    alexhawker likes this.
  2. alexhawker

    alexhawker Spent Gladiator

    Reputations:
    500
    Messages:
    2,540
    Likes Received:
    792
    Trophy Points:
    131
    Glad it was helpful. Dell should have this kind of manual for every machine. Their part numbering scheme (or seeming lack of one) leaves a lot to be desired, but the manuals are pretty good.
     
  3. ssj92

    ssj92 Neutron Star

    Reputations:
    2,446
    Messages:
    4,446
    Likes Received:
    5,689
    Trophy Points:
    581
    I'll be leaving the M4800 scene once again. :(

    Put mine up for sale.

    These are some nice machines even today. I tried out the newer Precision and was not a fan.
     
  4. alexhawker

    alexhawker Spent Gladiator

    Reputations:
    500
    Messages:
    2,540
    Likes Received:
    792
    Trophy Points:
    131
    Since you're not a fan of the newer Precision, what are you replacing it with, if you don't mind me asking? Mine is showing its age; it's not bad around the house, but I'm less of a fan of lugging it around now than when I got it.

    I received a Pixelbook as a gift, which is a nice thin/light option, but obviously not the same, so I might be in the market for something new. It would be nice to hear what options you're looking at.
     
    ssj92 likes this.
  5. ssj92

    ssj92 Neutron Star

    Reputations:
    2,446
    Messages:
    4,446
    Likes Received:
    5,689
    Trophy Points:
    581
    I use my Alienware Area-51m. It basically has a desktop 9900K and an RTX 2080.

    If I'm home I use my desktop: Xeon E5-1660 v3 & Titan Xp
     
    alexhawker likes this.
  6. bitbang3r

    bitbang3r Newbie

    Reputations:
    0
    Messages:
    3
    Likes Received:
    0
    Trophy Points:
    5
    I have a first-gen m4800 (FHD panel, i7-4800mq) with a USB 2 dock and a K2100m video card.

    I'm presently using it with two external monitors (both 1920x1080@60hz, connected using DVI-1 and DVI-2 on the dock), and the built-in FHD panel as monitor #3 (connected to the K2100m via Optimus).

    Now that they're fairly cheap, I'd like to upgrade my primary external monitor to one that can do 3840x2160@60hz via DisplayPort and end up with the following:

    • Laptop's built-in FHD panel, running 1920x1080@60hz as my "left" monitor.
    • New 3840x2160 monitor running @ 60hz via DisplayPort, as my middle/main monitor
    • Old 1920x1080 monitor running @ 60hz via DVI-2 as my "right" monitor
    ... or, possibly, use the laptop with the display closed and disabled (when docked), and my two original monitors connected to DVI-1 and DVI-2, with the new 4K monitor sitting between them and connected via Displayport.

    Can I actually DO this, or is there some insidious limitation imposed by Windows, the K2100m, and/or the way Dell wired everything together that would make this brittle or impossible?
     
  7. Aaron44126

    Aaron44126 Notebook Prophet

    Reputations:
    879
    Messages:
    5,553
    Likes Received:
    2,076
    Trophy Points:
    331
    Back when I used an M4800 at work, I actually had a setup like this: two 4K monitors attached to the system with a K1100M video card. It does work. The DisplayPort connection through the dock can be finicky, though. It would sometimes want to drop down to 30 Hz instead of 60 Hz for the 4K displays. (4K wasn't a thing when these docks were originally released.) Attaching to the DisplayPort directly on the system was always solid, but there is only one of those. (This system can only do 4K at 30 Hz over the HDMI port.)
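
    If you want to see what Windows has actually negotiated for each attached display (i.e., whether the dock link has silently dropped to 30 Hz), a quick Python sketch like this will print it. Just my own throwaway check, not anything Dell ships; it assumes pywin32 is installed:

    # List every display attached to the desktop with its current mode.
    # Assumes pywin32 ("pip install pywin32"); Windows only.
    import win32api
    import win32con

    i = 0
    while True:
        try:
            dev = win32api.EnumDisplayDevices(None, i)
        except Exception:
            break  # ran out of display adapters/outputs
        if dev.StateFlags & win32con.DISPLAY_DEVICE_ATTACHED_TO_DESKTOP:
            mode = win32api.EnumDisplaySettings(
                dev.DeviceName, win32con.ENUM_CURRENT_SETTINGS)
            print(dev.DeviceName, dev.DeviceString,
                  "%dx%d @ %d Hz" % (mode.PelsWidth, mode.PelsHeight,
                                     mode.DisplayFrequency))
        i += 1

    A 4K display that has fallen back will show up as 3840x2160 @ 30 Hz here even if the monitor's own menu still claims a 60 Hz input.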
     
  8. bitbang3r

    bitbang3r Newbie

    Reputations:
    0
    Messages:
    3
    Likes Received:
    0
    Trophy Points:
    5
    Oh, wow... all this time, and I never even noticed that the ports under the DVI ports on the dock were DisplayPort. I always just assumed they were HDMI ports, and that there was a passed-through DisplayPort connector somewhere on there that duplicated the one on the side of the laptop.

    From the research I just did, it looks like the pair of DisplayPort ports on the USB 2 variants of the e-dock are DP1.1, but the ones on the USB 3 variants are DP1.2. Can I assume that the dock YOU were using was the USB 3 variant?

    I did come up with what seems like it might be a plausible theory about how you might have gotten two 4k monitors to work at 60hz... but it's purely speculative at this point:

    1. Assumption: a GPU can have only a single root DisplayPort, and the best DisplayPort standard that existed in 2013 was DP1.2, which has only enough bandwidth to feed a single 4k monitor at 60hz. (See the rough math in the sketch just after this list.)

    2. Guess: DisplayPort #1 (under DVI-1 on the dock) and the m4800's right-side DisplayPort are hardwired to the Intel graphics subsystem. I doubt it's a proper DisplayPort hub, given how absurdly expensive they were until VERY recently. My guess is that they just used a simple crossbar circuit to detect a cable in one or the other (giving preference to one if both have cables) & tri-stated the wires to the inactive port. DisplayPort #2 (under DVI-2 on the dock) is hardwired to the graphics card (the K2100m, in this case).

    3. Pure speculation: since the K2100m card was an expensive premium option, it MIGHT actually have a real onboard DisplayPort hub. In this case, DisplayPort #1 on the dock might be switchable so that it's connected to Intel graphics when DVI-1 on the dock is connected to Intel graphics (using Optimus to patch through K2100m graphics), and connected to the K2100m's DisplayPort hub when DVI-1 is connected directly to the K2100m.
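
    For the bandwidth claim in #1, here's the back-of-envelope math I'm relying on. These are purely my own numbers (4 lanes, 24-bit color, ~10% blanking overhead, 8b/10b encoding), so treat this as a sanity check rather than gospel:

    # Rough DisplayPort bandwidth check.
    # Assumptions (mine): 4 lanes, 24 bpp, ~10% blanking, 8b/10b coding.

    def required_gbps(h, v, hz, bpp=24, blanking=1.10):
        return h * v * hz * bpp * blanking / 1e9

    payload_gbps = {
        "DP 1.1 (HBR)":  4 * 2.7 * 0.8,   #  8.64 Gbit/s usable
        "DP 1.2 (HBR2)": 4 * 5.4 * 0.8,   # 17.28 Gbit/s usable
    }

    uhd = required_gbps(3840, 2160, 60)   # ~13.1 Gbit/s
    for name, cap in payload_gbps.items():
        print("%s: one 4k60 fits: %s, two 4k60 fit: %s"
              % (name, uhd <= cap, 2 * uhd <= cap))

    So a single 4k60 stream fits a DP1.2 link but not a DP1.1 link, and two 4k60 streams can't share one DP1.2 link. That's the whole basis for the theory below.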

    Theory of operation:

    When you were able to successfully get two 4k monitors, connected to DP#1 and DP#2 on the dock, to work at 60hz, the monitor connected to DP#2 was directly driven by the K2100m via DisplayPort, and soaked up basically all of its DP1.2 bandwidth. Meanwhile, the monitor connected to DP#1 was directly driven by the Intel Graphics, patched through to the K2100m via Optimus.

    Although the two collectively soaked up way more bandwidth than the K2100m's single root DisplayPort could supply, that was OK, because the "Optimus" connection between the K2100m and Intel graphics was independent of it.

    My theory is that the occasions when you saw it drop the framerate from 60hz to 30hz probably occurred because something caused Windows/BIOS/m4800/god-knows-what to switch DisplayPort #1 on the dock from Intel/Optimus to K2100m-DP-hub.

    Does that sound plausible, or am I misunderstanding something major about the m4800's theory of operation? My biggest uncertainty is my assumption that a given GPU can have only a single root DisplayPort, and that it's subject to the same bandwidth limits as the DisplayPort standard of its downstream ports.

    If it turns out that there's no reason why a single card like the K2100m can't have two or more totally independent DisplayPorts without involving a hub, or if the DP1.2 bandwidth limit only applies to the child ports of the hub (so it could have a DP1.2 hub whose root port has enough bandwidth to fully saturate two child DP1.2 ports), then most of my theory of operation goes flying out the window. On the other hand, if I'm basically right about how DisplayPort and Optimus work, I can't really think of any other way the setup you described (two 4k monitors at 60hz) could work with a m4800 besides offloading one of the DisplayPorts to Intel graphics & patching the k2100m through to it via Optimus.

    ---

    Actually, I can think of another plausible way... but it would have probably been brittle & temperamental. If the Intel graphics and K2100m can (theoretically, though Windows probably begs to differ) operate simultaneously and independently of each other, you might have been able to have the monitor connected to DP#1 on the dock driven ENTIRELY by Intel Graphics, without involving the K2100m or Optimus at all.

    That said, I've gotten the distinct impression that Windows is known to be "unstable" (cough, cough) whenever you try using two non-identical graphics cards in a multi-monitor desktop-extended usage scenario.

    From what I recall (admittedly, circa 2014), if you're playing DRM-protected video in a window and try dragging it to the other screen, the moment the content window gets simultaneously touched by both GPUs, the app and one or both video drivers will crash. At best, Microsoft might have patched it up so that it can now gracefully recover instead of everything going down in flames (say, turning into a black box as it enters monitor #2 during the new key exchange, then decrypting again in the new window once the exchange is complete).

    Another scenario I remember being problematic involves GPU-accelerated 3D graphics. From what I recall, if you have a heterogeneous-GPU system, Windows only allows fullscreen windows to use GPU acceleration; otherwise, it forces software rendering. The fundamental problem is that things like texture and vertex buffers are local to the video card, so unless you actively go out of your way to ensure that BOTH GPUs have access to synchronized copies of the same data, the moment something crosses a screen boundary to the other card, you're going to end up with anything between an ugly glitch and an outright crash. I think THIS is the main reason why Unity's IDE lets you run it full-screen: if you didn't, and you were running on a multi-GPU system, Windows wouldn't use GPU acceleration for the IDE itself. I also remember reading about lots of insidious bugs related to this, where developers had Windows suddenly start refusing to use/allow GPU acceleration in full-screen preview windows.

    AFAIK, Unity has historically considered them to be low-priority bugs, because modern video cards have enough bandwidth to single-handedly drive multiple 4k monitors, and most people who DO have multiple cards are using SLI (which mostly sidesteps the problem). That said, I think bugs like those are being seen a lot more often now, because more and more developers are switching from desktops to high-powered laptops and encountering problems like these for the first time. I know I personally got bodyslammed for almost a week during my first encounter with Optimus... I thought I had it disabled, but the first time I tried using Unity undocked and without my extra monitors, Windows insidiously re-enabled it & used Intel graphics instead of the K2100m. I went to show off the game I was working on, and couldn't figure out why it inexplicably looked so awful. Ugh.
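
    (If anyone else gets bitten by this: the quickest check I know of is to ask the driver which GPU your process actually landed on. A tiny Python sketch, assuming pyGLFW and PyOpenGL are installed; this is my own hack, not anything Unity or NVIDIA provide:)

    # Print which GPU the default OpenGL context lands on.
    # Assumes pyGLFW ("pip install glfw") and PyOpenGL ("pip install PyOpenGL").
    import glfw
    from OpenGL.GL import glGetString, GL_VENDOR, GL_RENDERER

    if not glfw.init():
        raise RuntimeError("GLFW failed to initialize")
    glfw.window_hint(glfw.VISIBLE, glfw.FALSE)  # invisible probe window
    win = glfw.create_window(64, 64, "gpu-probe", None, None)
    if win is None:
        raise RuntimeError("could not create a GL context")
    glfw.make_context_current(win)

    # On an Optimus machine, "Intel" here means you did NOT get the K2100m.
    print("Vendor:  ", glGetString(GL_VENDOR).decode())
    print("Renderer:", glGetString(GL_RENDERER).decode())

    glfw.terminate()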
     
    Last edited: Jun 16, 2020
  9. Aaron44126

    Aaron44126 Notebook Prophet

    Reputations:
    879
    Messages:
    5,553
    Likes Received:
    2,076
    Trophy Points:
    331
    No, I was using a dock with USB2 ports.

    There is very little in the way of electronics in the dock; it's basically a pass-through port replicator. It's a giant "adapter" that arranges the signals coming out of the port on the bottom of the PC into ports that you can plug stuff into, passing pins on the connectors straight through to pins on the bottom of the system, where they are hooked up directly to the motherboard. Now, there would be a difference between USB2 and USB3, because USB3 ports have extra pins which would need to be added to the connectors and wired up. However, DisplayPort doesn't add pins between versions, so the DisplayPort version is actually determined by the capabilities of the laptop, not the dock.

    I always figured that because the USB2 version of the dock wasn't intended to transfer the amount of data required by a 4K stream over those ports, something was flaking out in the connection and the PC was automatically switching to a lower-bandwidth display mode. Maybe the USB3 version of the dock has more robust wiring between the ports, so there would be less of an issue hooking up 4K displays.

    Regarding your guess #2 about which GPU the ports are wired to: you can actually configure this. One of them is always wired to the NVIDIA GPU, but the other can be switched between the NVIDIA GPU and the Intel GPU. It's not automatic switching; the toggle is in the BIOS setup, in the video section, right next to where you can toggle graphics switching on or off. It's called "DP dock display mode" or something like that. (It applies to one of the DVI ports as well; one of them is user-switchable. You can't use the corresponding DisplayPort and DVI port at the same time. I'm guessing that there is just a DisplayPort-to-DVI adapter inside the dock which powers the DVI ports off of the same signal.)


    I don't remember the specifics of this setup too well; I didn't use the M4800 in this configuration for very long. I do remember something about the refresh rate dropping when a second display was connected, so maybe there is something to your DisplayPort hub/bandwidth limitation idea. I spent a while with the Precision 7510 in this configuration (with the same dock), and I believe I do remember finding it best for some reason to have one port running off of the NVIDIA GPU and the other one running off of the Intel GPU, like you have suggested here. I eventually moved one of the displays to connect directly to the laptop so that I could have everything running off of the Intel GPU, which was totally capable of driving three 4K displays (two external + one internal) all at 60 Hz.


    Anyway. I've gotta believe that the K2100M can run more than just one 4K display at 60 Hz on its own. You could order the M4800 with an (internal) 4K display. Optimus was locked out in this configuration, so you'd be using the NVIDIA GPU only. The internal display in this case would have an eDP connection. If it couldn't run any external displays in this configuration (without breaking 60 Hz), I'm sure that there would have been loads of complaints on this forum.
     
    Last edited: Jun 16, 2020
    alexhawker likes this.
  10. alexhawker

    alexhawker Spent Gladiator

    Reputations:
    500
    Messages:
    2,540
    Likes Received:
    792
    Trophy Points:
    131
    I think only the model with the 3200 x 1800 display had Optimus locked out. IIRC, the 4K model (which came out a fair bit later) was capable of Optimus.
     