The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static, read-only archive was captured by NBR forum users between January 20 and January 31, 2022, in an effort to preserve the valuable technical information that had been posted on the forums. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.

New M6500 Discussion Thread

Discussion in 'Dell Latitude, Vostro, and Precision' started by Quido, Dec 1, 2009.

  1. asalcedo

    asalcedo Notebook Consultant

    Reputations:
    1
    Messages:
    111
    Likes Received:
    3
    Trophy Points:
    31
    Hello debguy,

    I had already read most, if not all, your posts on this matter.

    That, plus the other contradictory input I have received, still leaves me confused.

    I have a dual-core unit at the moment and have ordered a quad-core one too.

    I am reluctant to open up my dual core unit because I may return it and don't want to risk damaging it. If I do, I will certainly report here.

    I am inclined to believe that some boards did indeed come with two slots initially, and that they all have four slots now. Perhaps because of the initial memory issue the Dell rep mentions in the chat above.

    I'll keep you posted on my progress.
     
  2. loganan

    loganan Newbie

    Reputations:
    0
    Messages:
    6
    Likes Received:
    0
    Trophy Points:
    5
    Bokeh-
    Did you have the RGB LED display, or the standard display? I'm having an issue w/ really intense saturation...
     
  3. nikhil170

    nikhil170 Notebook Geek

    Reputations:
    0
    Messages:
    94
    Likes Received:
    0
    Trophy Points:
    15
    Folks,

    Is there a thread on how to calibrate the colors properly and how to test the display's color gamut? I have the Covet and the red looks particularly great. Some would say it's too red, but IMO that's bloody (pun intended) awesome.

    And I have an annoying issue with the FX 3800. I've been playing L4D with all settings maxed at 1200p and it seems to handle the game well, but at certain points the FPS drops from 80 to ~15, usually when the smoke kicks in. The problem goes away when the shaders are set to Medium. I'm running the latest drivers from nvidia. Anyone having similar issues? I'm going to try L4D2 tonight and report back on the results.

    Ty,
    Nik
     
  4. Bokeh

    Bokeh Notebook Deity

    Reputations:
    1,330
    Messages:
    1,777
    Likes Received:
    259
    Trophy Points:
    101
    I have the RGB LED. We measured and calibrated and generally talked most color issues to death last January in this thread. If you think your colors are too intense, pull the digital vibrance back to 45% or turn down the brightness to 7 or 8 out of the max brightness of 15. TheZoid also posted ICC calibration curves that I made on his m6500 and mine.

    The issue with the RGB screen is not really an issue. You have a screen that will display a very wide gamut of colors, but will also go up to around 300 nits of brightness. Once you get past around 180-200 nits the colors start going past neutral and start looking a little glowy. The panel is simply doing what you ask it to do - get really bright without the colors crushing down in intensity.
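    The numbers in this post (a 15-step brightness control, roughly 300 nits at maximum, and colors going glowy past about 180-200 nits) can be sketched numerically. This is purely illustrative and assumes brightness scales linearly with the step setting, which real panels rarely do exactly:

    ```python
    MAX_STEP = 15      # brightness steps on the RGB LED panel, per the post
    MAX_NITS = 300.0   # approximate peak brightness, per the post

    def approx_nits(step, max_step=MAX_STEP, max_nits=MAX_NITS):
        """Rough luminance estimate for a given brightness step,
        assuming linear scaling (an assumption, not measured data)."""
        return max_nits * step / max_step

    for step in (7, 8, 15):
        print(step, approx_nits(step))  # 140.0, 160.0, 300.0
    ```

    Under this linear assumption, the recommended setting of 7 or 8 lands at roughly 140-160 nits, below the ~180-200 nit range where the post says colors start to look glowy.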
     
  5. Crimsoned

    Crimsoned Notebook Deity

    Reputations:
    268
    Messages:
    1,396
    Likes Received:
    3
    Trophy Points:
    56
    Hi Nik, let me help you out with an explanation of sorts.

    Mobile GPUs use pixel/vertex pipelines. These are hard to utilize efficiently in games, but they are more efficient in terms of heat output and at CAD, design, and computational work (statistics, etc.).

    On top of that, mobile workstations have fewer pipelines than desktop GPUs (desktop GPUs use stream processors, which are more flexible for gaming, physics, etc.), so a game ends up using most of the pipelines for tasks like geometry, leaving fewer for things like physics (gases, smoke, etc. require a lot of computational power, which the game is already consuming).
     
  6. jeremyshaw

    jeremyshaw Big time Idiot

    Reputations:
    791
    Messages:
    3,210
    Likes Received:
    231
    Trophy Points:
    131
    Pipelines have nothing to do with mobile or not.

    In the past, this was mostly driven by DirectX specifications (on the consumer side): pixel shaders and vertex shaders were used from DX5 through DX9c. Then DX10 introduced another shader type, geometry, and ATi and nVidia both saw fit to create a single unified shader with shared execution resources instead of three separate shaders with their own execution resources - it simply saved die space and lowered the overall transistor count (thus saving power and reducing heat output as a direct result). The real issue with mobile GPUs is their desktop cousins. The nVidia GTX580 will draw a mind-blowing 250W at max load, and the HD6970 will draw the same amount. Even very large laptops have trouble dissipating more than 150W of heat - and a laptop must consider that the GPU, CPU, chipset, HDD, and RAM all dissipate some amount of heat. As a result, laptop parts are normally downclocked, power-gated - whatever it takes - to fit within a lower power-usage/heat-output profile. An unfortunate side effect is that laptop parts are often significantly slower than their desktop counterparts.

    However, at a lower hardware level, laptop and desktop GPUs are identical - save for clock speeds and enabled execution resource counts.
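    The power-budget argument in this post can be written out as a toy check. Only the 250 W desktop-flagship draw and the ~150 W laptop ceiling come from the post; the other per-component figures, and the mobile-GPU number, are illustrative placeholders:

    ```python
    LAPTOP_BUDGET_W = 150.0  # rough dissipation ceiling for a very large laptop, per the post

    # Hypothetical component heat figures; only the desktop GPU's 250 W is from the post.
    components = {"GPU": 250.0, "CPU": 45.0, "chipset": 10.0, "HDD": 5.0, "RAM": 5.0}

    total = sum(components.values())
    print(total, total <= LAPTOP_BUDGET_W)  # the desktop GPU alone already blows the budget

    # Downclock/power-gate the GPU to an illustrative mobile-class figure:
    components["GPU"] = 75.0
    mobile_total = sum(components.values())
    print(mobile_total, mobile_total <= LAPTOP_BUDGET_W)  # now the whole system fits
    ```

    The point is the same as the post's: a desktop flagship by itself exceeds what any laptop can dissipate, so mobile parts must be cut down to fit the budget.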
     
  7. Crimsoned

    Crimsoned Notebook Deity

    Reputations:
    268
    Messages:
    1,396
    Likes Received:
    3
    Trophy Points:
    56
    Excuse me for not explaining it so technically, but in the end you are arguing my point. The differences between desktop and mobile GPUs are in the number of pipelines or stream processors, as well as lower clock speeds. This is done precisely for thermal reasons.
    The limited pipeline count is why gas/fog/smoke etc. in games can really bog down a laptop, due to the heavy use of physics calculations. Now take games which have been adapted to use PhysX, and you have a recipe for a seriously bogged-down computer (desktop or laptop).
    This is why Nvidia's PhysX is sometimes wanted: a secondary dedicated GPU can handle the physics aspect of games, reducing the overall load on the CPU/GPU. In laptops, of course, you do not have this unless you run an SLI system with PhysX-enabled GPUs.

    Remember, unlike other types of processing, physics work is one calculation at a time, and cache becomes useless since no two physics problems should be the same. Unlike more predictable workloads, physics calculations will tie up a pipeline for a single calculation with no caching available. Now take a system that needs 100% of the GPU for rendering the game, try to allocate 20% of the GPU for physics, and you get a slowdown. Of course, in physics-intensive scenarios that usage can easily go past 20%.
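    The contention arithmetic sketched above can be made concrete with the ~80 FPS figure reported earlier in the thread. This is an idealized model, purely for illustration, in which physics steals a fixed fraction of each frame's GPU time; real contention is much messier:

    ```python
    def fps_with_physics(base_fps, physics_fraction):
        """If a fraction of each frame's GPU time goes to physics, the
        remaining render throughput scales down proportionally
        (an idealized model, not a measured relationship)."""
        return base_fps * (1.0 - physics_fraction)

    print(fps_with_physics(80.0, 0.20))  # physics eating 20% of the GPU
    print(fps_with_physics(80.0, 0.80))  # heavy smoke scene; near the ~15 FPS dips reported
    ```

    Under this model, a 20% physics load only trims 80 FPS to 64, so dips all the way to ~15 FPS would require physics-heavy moments to consume most of the GPU, matching the post's note that usage "can easily go past 20%".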

    On a separate note, I just installed L4D2 on my M6500 with an FX 2800M and did not notice any bogging down of the system during smoke, fire, and heavy lighting scenarios. This is running at 1920x1080 (external monitor) with all high settings, 4x AA, 8x AF, and triple buffering. So I am going to guess that the person's system may have separate issues.
     
  8. nikhil170

    nikhil170 Notebook Geek

    Reputations:
    0
    Messages:
    94
    Likes Received:
    0
    Trophy Points:
    15
    Yeah - I'm going to try L4D2 and see the results. It could be L4D-specific, you know. That's the thing, though: 3DMark results are so pointless - they can't dictate gameplay quality. However, I don't use V-Sync. I'll try with V-Sync on next time in L4D, and just try L4D2, and reply back.
     
  9. nikhil170

    nikhil170 Notebook Geek

    Reputations:
    0
    Messages:
    94
    Likes Received:
    0
    Trophy Points:
    15
    K, just finished Sacrifice with all settings maxed + 16x AF + 4x MSAA, v-sync off, at 1200p, and the game was better at maintaining frame rates. Going to try the same settings in L4D 1.
     
  10. asalcedo

    asalcedo Notebook Consultant

    Reputations:
    1
    Messages:
    111
    Likes Received:
    3
    Trophy Points:
    31
    Hello Bokeh,

    I need with urgency a color profile for an M6500 with an RGB LED screen.

    According to my Dell order my screen is

    P8GC1 Module,Liquid Crystal Display,17WU,Red/Green/Blue,Samsung,M6500

    According to Device Driver the Monitor Hardware ID is LGD0241

    And according to HWiNFO32 the monitor name is DR740171WU5


    I do not have my X-Rite i1 profiler with me now, and I need to do some color-critical Photoshop work these holidays AND decide within four days whether I keep this M6500 or not.


    Based on my experience with my M6400 with the RGB screen (monitor hardware ID LGD018E), the results with the i1 profile were amazingly good. But a profile is a must, and not all screens profile as well.



    I have searched this thread going back to May, but I have not found the ICC profiles that you mention TheZoid published. A "borrowed" profile will not be as good as a profile made for my specific screen, but, if made for the same make and model, it should be very close.

    I do know for sure that the monitor in my M6500 is RGB; one just has to look at the saturation of the magentas and greens.

    Many thanks.

    Antonio
     