The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to preserve the valuable technical information that had been posted on the forums. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.

Vostro 1400 overclocking, it's crazy.

Discussion in 'Dell Latitude, Vostro, and Precision' started by Zer0N1nja, Aug 9, 2007.

Thread Status:
Not open for further replies.
  1. ChaosKye

    ChaosKye Notebook Consultant

    Reputations:
    5
    Messages:
    222
    Likes Received:
    0
    Trophy Points:
    30
    I'd have to agree that it isn't safe to overclock to the levels you guys were discussing in the earlier posts. I'm just curious how far I can clock w/ minimal increases in temperature.
     
  2. Rowen

    Rowen Notebook Consultant

    Reputations:
    35
    Messages:
    286
    Likes Received:
    0
    Trophy Points:
    30
    Well, what's a minimal increase? We don't even know which temp utility to trust, so we can't even set a baseline temp to measure against.
     
  3. ChaosKye

    ChaosKye Notebook Consultant

    Reputations:
    5
    Messages:
    222
    Likes Received:
    0
    Trophy Points:
    30
    Well, I was hoping at first to get something safe around what Wikipedia says the card should be clocked to (400/600), but I don't know if it'll be that stable long term. I'm doing everything based on relative temperature increases. Since ATITool is reading 96 under a full load at stock settings, I probably won't use any clock settings that push it above around 102 at full load. I'm getting 47 for the GPU at stock under load, so I'll probably try to stay under 50 at full load with the new clocks. Watch all your temperature readings. ATITool also has an artifact test; you can use it to check your system's stability at a given clock speed over an extended period.

    I was wondering if someone with experience w/ AS5 could give insight into whether it would be a smart idea with this laptop. I also thought it was just for CPUs; does it work for GPUs as well?
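    The relative-headroom rule above (only accept an overclock if load temps stay within a few degrees of the stock-settings load temps) can be sketched as a quick check. The numbers are the ones from this post; the function name and the 6-degree allowance are just illustrative, not anything from ATITool:

    ```python
    def within_thermal_budget(stock_load_temp, current_load_temp, max_rise=6):
        """Return True if the overclocked full-load temperature stays within
        an allowed rise over the full-load temperature at stock clocks."""
        return current_load_temp - stock_load_temp <= max_rise

    # Readings from the post: 96 at stock load, ceiling of ~102 (a 6-degree rise)
    print(within_thermal_budget(96, 101))  # inside the budget
    print(within_thermal_budget(96, 104))  # too hot, back the clocks off
    ```

    The same check works for whichever sensor you decide to trust as a baseline, since it only compares readings from the same tool against each other.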
     
  4. chuck232

    chuck232 Notebook Deity NBR Reviewer

    Reputations:
    274
    Messages:
    1,736
    Likes Received:
    1
    Trophy Points:
    55
    AS5 works fine for GPUs; however, the problem with many laptop GPUs is that they're sandwiched under a thick thermal pad. As a result, the amount of thermal paste you'd have to put on to bridge the gap is extreme and would probably result in worse temperatures. I know that was the case with my old ASUS Z71V; I had to mod the heatsink to get proper contact after I removed the thermal pad.
     
  5. devilsnight

    devilsnight Notebook Geek

    Reputations:
    0
    Messages:
    87
    Likes Received:
    0
    Trophy Points:
    15
    chuck is right. With the 1400 there is a gap between the GPU and the heatsink, which is filled with a thermal pad and a copper shim.

    I ordered a different thermal pad for the chipset to see if I can lower temps that way. I'm confused about which program is reading the temperatures correctly; why would i8kfan read the temps for the GPU under chipset? This is fairly messed up. I've taken apart my laptop several times, and am going to do it again once my stuff comes. I just gotta buy a copper shim for my future modding.
     
  6. zragnarok2

    zragnarok2 Notebook Enthusiast

    Reputations:
    0
    Messages:
    17
    Likes Received:
    0
    Trophy Points:
    5
    RivaTuner says my base clocks are 170 MHz for the core and 100 MHz for the mem... wtf (8400GS btw)
     
  7. ChaosKye

    ChaosKye Notebook Consultant

    Reputations:
    5
    Messages:
    222
    Likes Received:
    0
    Trophy Points:
    30
    There are 3 different modes for your video card. They should go (170, 100) for 2D, (275, 300) for low-power 3D, and (400, 500) for high-power 3D. Also, the 162.xx series driver is not good for OC, if that's what you're using.
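    The three performance levels above can be written down as a small lookup table. The mode names and the structure here are just for illustration; the (core, memory) MHz pairs are the ones reported in this post:

    ```python
    # (core MHz, memory MHz) for each 8400-series performance level,
    # per this thread; the mode names are illustrative, not NVIDIA's.
    CLOCK_MODES = {
        "2d":             (170, 100),
        "low_power_3d":   (275, 300),
        "performance_3d": (400, 500),
    }

    def expected_clocks(mode):
        """Look up the stock (core, mem) clocks for a given performance level."""
        return CLOCK_MODES[mode]

    core, mem = expected_clocks("2d")
    print(f"2D mode: {core} MHz core / {mem} MHz memory")
    ```

    This is why RivaTuner showing 170/100 at idle isn't alarming: that's just the 2D level, and the card ramps up when a 3D load kicks in.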
     
  8. devilsnight

    devilsnight Notebook Geek

    Reputations:
    0
    Messages:
    87
    Likes Received:
    0
    Trophy Points:
    15
    Alright guys, I'm back with something interesting. I took a hair dryer (yes, a hair dryer), took off my keyboard, and took off the back panel of the laptop where the RAM is.

    Next to the RAM is the GPU heatsink. I loaded up i8kfan and watched the temps.

    First I pointed the hair dryer into the RAM area, blowing hot air toward the GPU, and watched the GPU temps rise. Yes, the GPU: under i8kfan it showed my GPU temps going up but my chipset temps staying the same.

    Then I took the hair dryer and pointed it at the top of the laptop, under the keyboard right above where the thermal pad for the chipset is, and watched the temps of the chipset go up! Alongside the chipset temps I noticed my GPU temps go up too.

    I believe i8kfan is reporting the correct temps; RivaTuner, ATITool, and nTune seem to report the chipset temp as the GPU temp. So far I think the chipset gets extremely hot and the heat carries over to the GPU.

    I know this is just me rambling, but try what I tried and see for yourself. I think it's the Intel chipset which is running hot. The GPU itself seems to be cooled quite well!


    That's my 2 cents. I'll keep testing... damnit, I got addicted to this thread.
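    The logic of the hair-dryer test can be sketched in a few lines: heat one spot, record each sensor before and after, and the sensor that rises the most is the one most likely sitting on the heated component. The function name and the synthetic readings are made up for illustration; this doesn't talk to i8kfan, it just shows the inference:

    ```python
    def identify_heated_sensor(before, after):
        """Given temperature readings (dict: sensor name -> degrees) taken
        before and after locally heating one component, return the sensor
        that rose the most, i.e. the one likely on the heated part."""
        rises = {name: after[name] - before[name] for name in before}
        return max(rises, key=rises.get)

    # Synthetic numbers mimicking the test with heat aimed at the GPU area
    before = {"i8kfan_gpu": 47, "i8kfan_chipset": 62}
    after  = {"i8kfan_gpu": 58, "i8kfan_chipset": 63}
    print(identify_heated_sensor(before, after))
    ```

    Repeating this once per heated spot is exactly the experiment above: if heating the GPU area moves only the "GPU" reading, that label is probably right.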
     
  9. devilsnight

    devilsnight Notebook Geek

    Reputations:
    0
    Messages:
    87
    Likes Received:
    0
    Trophy Points:
    15
    What cooling options are available to cool an Intel chipset? Does it say anywhere on Intel's site what temps it should run at?
     
  10. ChaosKye

    ChaosKye Notebook Consultant

    Reputations:
    5
    Messages:
    222
    Likes Received:
    0
    Trophy Points:
    30
    That was a very interesting experiment you did, lol, and it cleared up some questions, which is good to know. Too bad there is still the heat issue. Really hot, whether it's the GPU or not, is not a good thing.
     