The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information posted on the forums was preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.

Dell Precision 5510 Owner's Lounge

Discussion in 'Dell Latitude, Vostro, and Precision' started by Bokeh, Nov 24, 2015.

  1. jefflackey

    jefflackey Notebook Evangelist

    Reputations:
    96
    Messages:
    352
    Likes Received:
    38
    Trophy Points:
    41
    One question: When I see a 5510 config with the Quadro GPU and another with the Intel 530 GPU - I thought the 530 was integrated in with the Intel CPU? Is the Quadro GPU in addition to the 530 and can you change between them? Or does it somehow replace the integrated graphics?

    Thanks
     
  2. John Ratsey

    John Ratsey Moderately inquisitive Super Moderator

    Reputations:
    7,197
    Messages:
    28,839
    Likes Received:
    2,158
    Trophy Points:
    581
    The Nvidia Optimus configuration is quite clever: The Intel GPU always drives the output devices while the Nvidia GPU leaps into action when needed as a graphics co-processor and then goes back to sleep when not needed (with negligible impact on the battery usage). It's not the switchable graphics of past times which sometimes caused headaches. The Nvidia GPU also supports CUDA which can be used to accelerate massively parallel non-graphical computations. There can be situations when the CPU and both GPUs are all working together, although no individual component will be working at its best for very long due to overall power and thermal constraints. The SiSoftware Sandra GP Processing benchmark creates such a situation.

    I have read that there is a configuration of the 5510 which doesn't have the dGPU (and would save a few $$$s). In reality, that would serve my needs (provided that Dell hasn't dispensed with one of the fans) but I've not seen it on offer in my part of the world.

    John
     
    Last edited: Jan 1, 2017
  3. John Ratsey

    John Ratsey Moderately inquisitive Super Moderator

    Reputations:
    7,197
    Messages:
    28,839
    Likes Received:
    2,158
    Trophy Points:
    581
    Someone in the XPS forum provided a link to this page which mentions an Nvidia M2100 GPU. That GPU is also mentioned here. CES is imminent so more may be revealed within a few days (in which case it may soon be time to start a Precision 5520 thread).

    John
     
  4. TechCritic

    TechCritic Notebook Guru

    Reputations:
    0
    Messages:
    58
    Likes Received:
    5
    Trophy Points:
    16
    John,

    I'm curious, you describe the Nvidia GPU as a co-processor, I was under the impression that for a single monitor it was an either/or situation with the iGPU and Nvidia GPU - the Nvidia doesn't supplement the iGPU, it completely takes over temporarily and the iGPU shuts down temporarily. Then if using multiple external monitors, the iGPU can be used to drive the laptop's own display while the dGPU simultaneously drives the external monitors.

    Anyway when I hear the term co-processor, I think of an IC that supplements another CPU or GPU by offering hardware acceleration or power efficiency for a particular subset of applications that a CPU or GPU can accomplish itself less efficiently. A co-processor supplements the CPU or GPU rather than completely taking over all functions.

    I am not trying to correct you on the use of the term co-processor. I'm not entirely sure I'm using it correctly myself. I explained my conception of it because I'm wondering if the Optimus GPU setup in fact works differently than I thought.
     
  5. TechCritic

    TechCritic Notebook Guru

    Reputations:
    0
    Messages:
    58
    Likes Received:
    5
    Trophy Points:
    16
    Did you get an active adapter cable? HDMI and DVI use different connectors but carry the same digital signal on each pin (DVI just lacks sound), so passive adapters work between them. That's not the case between DisplayPort and HDMI/DVI: going from HDMI to DisplayPort requires an active adapter with ICs in it to convert the digital signal. That's a much more expensive cable or cable/converter, probably at least $50, and while I'm not sure whether such an adapter degrades performance, it's certainly possible. For cost and signal quality reasons I would avoid any setup that requires an active adapter if at all possible.

    A DP++ DisplayPort output on a computer can drive HDMI or DVI with a passive cable because the computer converts the signal internally, but the same passive cables can't be used in the other direction, from HDMI or DVI on a computer to DisplayPort on a monitor. It's a common mistake to buy a passive DP to HDMI/DVI cable thinking it works both ways; it doesn't.

    It would make much more sense to use a passive USB-C or Thunderbolt 3 to DisplayPort cable. The cable is drastically cheaper and you don't risk degrading the signal through conversion (I don't know if that's ever an issue here, but I wouldn't put it past third-party manufacturers to put low-quality ICs in the adapter).
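    To keep the directions straight, here's a tiny sketch of the rule described above, written as plain logic (my own summary, not from any spec text):

```python
# Which source->sink combinations need an active (signal-converting) adapter?
# Rule of thumb from the post above: HDMI and DVI are passively compatible
# with each other, and a DP++ source can passively drive HDMI/DVI, but
# anything feeding INTO a DisplayPort sink needs active conversion.

def needs_active_adapter(source: str, sink: str) -> bool:
    passive_tmds = {"hdmi", "dvi"}  # same TMDS signaling, different plugs
    if source == sink:
        return False
    if source in passive_tmds and sink in passive_tmds:
        return False  # e.g. HDMI -> DVI: passive cable is fine (no audio on DVI)
    if source == "dp++" and sink in passive_tmds:
        return False  # DP++ port re-drives TMDS itself, passive cable works
    return True  # e.g. HDMI source -> DisplayPort monitor: active converter

print(needs_active_adapter("dp++", "hdmi"))  # passive cable works
print(needs_active_adapter("hdmi", "dp"))    # active adapter required
```

    The asymmetry is the whole trap: the passive DP-to-HDMI cable that works out of a laptop's DP++ port does nothing useful plugged the other way around.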
     
  6. TechCritic

    TechCritic Notebook Guru

    Reputations:
    0
    Messages:
    58
    Likes Received:
    5
    Trophy Points:
    16
    I don't know if your monitor supports it, but I've read reports that the 5510's HDMI port can output 4K at 30Hz, so if that would be acceptable, you could use HDMI. I'm not certain, but unless you're gaming, chances are you wouldn't notice the difference. You would probably need to change the monitor's refresh rate in the Intel/Nvidia graphics card UI, or if there's no option there, I'd go through the monitor's settings (monitor buttons) and see if you can change the refresh rate there. It's possible that your monitor doesn't support 30Hz, but I think adding that ability should be trivial when it supports 60Hz, so it probably does.
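    For what it's worth, the 30Hz ceiling falls out of simple bandwidth arithmetic. A rough sketch, assuming HDMI 1.4's 10.2 Gbit/s link rate with 8b/10b encoding and ignoring blanking overhead (so real limits are slightly tighter):

```python
# Why HDMI 1.4 can do 4K at 30 Hz but not 60 Hz: raw pixel bandwidth.

def required_gbps(width: int, height: int, hz: int, bits_per_pixel: int = 24) -> float:
    """Uncompressed video bandwidth in Gbit/s, before encoding overhead."""
    return width * height * hz * bits_per_pixel / 1e9

# 10.2 Gbit/s link rate, 8b/10b encoding leaves ~80% for actual video data.
HDMI_14_EFFECTIVE = 10.2 * 0.8  # ~8.16 Gbit/s

for hz in (30, 60):
    need = required_gbps(3840, 2160, hz)
    verdict = "fits" if need <= HDMI_14_EFFECTIVE else "exceeds"
    print(f"4K @ {hz} Hz needs ~{need:.2f} Gbit/s -> {verdict} HDMI 1.4")
```

    4K at 30Hz needs roughly 6 Gbit/s, which fits; 60Hz needs roughly 12 Gbit/s, which is why 4K60 had to wait for HDMI 2.0.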
     
  7. John Ratsey

    John Ratsey Moderately inquisitive Super Moderator

    Reputations:
    7,197
    Messages:
    28,839
    Likes Received:
    2,158
    Trophy Points:
    581
    This explanation of Optimus in Wikipedia is fairly good. In the iGPU + dGPU hardware configuration, the dGPU is basically a 3D rendering engine which is enabled when needed. All output and, I assume, all input goes through the iGPU, so the rest of the computer hardware only has one device to talk to and the challenges of graphics switching are much reduced. There are also notebooks where the iGPU is disabled and all graphics is handled by the dGPU. However, this tends to adversely impact battery time.

    The term "co-processor" came to my mind as my memories include the older Intel CPUs (486 and earlier) where the floating point co-processor was an extra cost option and its presence made a big difference in the speed of floating point computations, but I think it's reasonable to use it for describing the Optimus configuration. However, I'm not sure if the system is clever enough to get both the iGPU and dGPU to concurrently do rendering using the shaders in each.

    John
     
  8. Aaron44126

    Aaron44126 Notebook Prophet

    Reputations:
    879
    Messages:
    5,553
    Likes Received:
    2,076
    Trophy Points:
    331
    Right... With NVIDIA Optimus, the Intel GPU is always driving the display. If the NVIDIA GPU is called upon for rendering work, its output is sent to the Intel GPU framebuffer and the Intel GPU displays it on the screen. You can choose which GPU is used by which programs on an app-by-app basis, in the NVIDIA control panel under "Manage 3D Settings" / "Preferred graphics processor." (The default is "auto-select" which means profiles set up by NVIDIA make the decision.)

    The Intel GPU is not "shut down" while the NVIDIA GPU is working, however the NVIDIA GPU can fully power off when it is not needed.
     
  9. BleachCake

    BleachCake Notebook Enthusiast

    Reputations:
    0
    Messages:
    19
    Likes Received:
    3
    Trophy Points:
    6

    Hi there,

    The thing with the HDMI connection is that the monitor itself can't display higher than 1080p when connected via HDMI (I read that on the EIZO page).

    I've connected it by outputting an HDMI signal from the laptop, then using an HDMI->DisplayPort adapter and a DisplayPort cable hooked into the EIZO.

    When using the DisplayPort connection the monitor can represent 1.07 billion colors (according to EIZO again).

    It does lag a lot when connected via the Dell dock since, as someone already mentioned, that docking station is USB 3 and its bandwidth is much less than what the DisplayPort connection needs. This is very noticeable when caching a preview of an image sequence, for example.
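    The numbers back up that bandwidth explanation. A rough sketch, assuming (since the exact EIZO model isn't stated here) a 2560x1440 panel at 60Hz with 10-bit-per-channel color:

```python
# Back-of-envelope check of the USB 3.0 dock bottleneck.
# Assumed display: 2560x1440 @ 60 Hz, 10 bits per channel (30 bits/pixel).

USB3_GBPS = 5.0  # USB 3.0 raw signaling rate in Gbit/s

def stream_gbps(width: int, height: int, hz: int, bits_per_pixel: int) -> float:
    """Uncompressed video stream bandwidth in Gbit/s."""
    return width * height * hz * bits_per_pixel / 1e9

need = stream_gbps(2560, 1440, 60, 30)
print(f"Uncompressed stream: ~{need:.2f} Gbit/s vs {USB3_GBPS} Gbit/s USB 3.0 link")
# The raw stream alone already exceeds the whole USB 3.0 link (which also
# carries other dock traffic), so the dock has to compress the video,
# which would explain the lag during image-sequence previews.
```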

    There is also a DVI-D connection available but it can't represent the 10-bit color space.

    About the 4K output from the Precision: I don't have a 4K monitor so I can't confirm the refresh rate.

    Best,
    Ronnie


    Sent from my iPhone using Tapatalk
     
  10. ronelv

    ronelv Newbie

    Reputations:
    0
    Messages:
    2
    Likes Received:
    0
    Trophy Points:
    5
    Hello, did anyone experience the notorious coil whine from the XPS series on the Precision 5510?
    Thanks
     