The Notebook Review forums were hosted by TechTarget, who shut them down on January 31, 2022. This static, read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to preserve the valuable technical information that had been posted on the forums. For current discussions, many NBR forum users moved to NotebookTalk.net after the shutdown.

Vostro 1400 overclocking, it's crazy.

Discussion in 'Dell Latitude, Vostro, and Precision' started by Zer0N1nja, Aug 9, 2007.

Thread Status:
Not open for further replies.
  1. KitnaMiracle

    KitnaMiracle Notebook Enthusiast

    Reputations:
    0
    Messages:
    32
    Likes Received:
    0
    Trophy Points:
    15
    I'm using the stock drivers that came with my Vostro 1500; they are version 101.28.
    Apparently they are pretty new drivers, and I have been able to overclock with them using RivaTuner. I've downloaded ATITool but have not installed it for fear there would be complications from using two tools at the same time. It seems like ATITool is easier to use, with the hairy cube and the scanning feature.

    My 8400M GS came at 400 core / 400 memory for 3D performance.
    I've only overclocked to about 430/470, with about a 5-10 degree increase in temperature. I've been using I8kfanGUI and RivaTuner to track temperatures, but the readings are off. I've heard I8kfanGUI has not been updated to be compatible with the new cards in Dells; it shows a lower idle temp but a higher load temp.

    I've been using 3DMark05 to check for abnormalities. Nothing yet, but I don't know how many times to run it, especially with the limitations of the free version. Does anyone know if ATITool's scanning and hairy cube are sufficient tests for overclock stability?
     
  2. Triple_Dude

    Triple_Dude Notebook Evangelist

    Reputations:
    75
    Messages:
    589
    Likes Received:
    0
    Trophy Points:
    30
    Whoa! THAT I gotta see! Point me in the right direction, please :D. I've always wanted to OC my CPU... The 1.6GHz T5470 just ain't cutting it anymore.
    Yep, the stock drivers are great for OC'ing :).
     
  3. StenLi

    StenLi Notebook Enthusiast

    Reputations:
    0
    Messages:
    23
    Likes Received:
    0
    Trophy Points:
    5
    Are you crazy? Stock drivers? 101.28 is pretty new? Try 163.44 and you'll see what a performance boost you get. And when you want to OC, use 158.45.

    Triple_Dude, this is all about the GPU, not the CPU. We're not OC'ing CPUs here.
     
  4. steveeb

    steveeb Notebook Guru

    Reputations:
    2
    Messages:
    63
    Likes Received:
    0
    Trophy Points:
    15
    I found that just updating the drivers from stock to 163.44 increased my 3DMark06 score by 100.
     
  5. KitnaMiracle

    KitnaMiracle Notebook Enthusiast

    Reputations:
    0
    Messages:
    32
    Likes Received:
    0
    Trophy Points:
    15
    Well, I'm on Windows XP Home (32-bit), so which drivers should I use? Or where can I find information on the best drivers for each OS? Also, how much of an overclock is recommended to get a significant increase in gaming performance without needing cooling accessories or shortening GPU life?
     
  6. LlamaOne

    LlamaOne Notebook Enthusiast

    Reputations:
    0
    Messages:
    22
    Likes Received:
    0
    Trophy Points:
    5
    Hi,
    I have an Inspiron 1520 with an 8400M GS.
    I just overclocked my GeForce 8400M GS from the stock 400/800/400 (core/shader/memory) to:
    Core: 620 MHz and shaders: 1450 MHz (wow!), but memory just 500 MHz, and that is what bugs me, because some of you seem to have reached a lot higher with the memory.

    I can go a bit higher with the memory, but beyond 500 ATITool reports artifacts. So I was wondering: has anyone managed to get beyond 500 without artifacts, or is my memory just bad?

    Also, I couldn't overclock on any drivers other than the Dell driver (I tried 158, 16x, etc.), so after I found my stable, artifact-free maximum frequencies with the Dell driver I just modified the BIOS with NiBiTor and nvflash. Now I can use any driver and the overclock is embedded in the BIOS :D It's also cool because I could OC the shaders much higher. I will try to attach my BIOS config.

    I have to say I am impressed with the OC of the GPU (max temp so far after 3DMark03 is 67C) but not so much with the memory :( And as far as I have seen, memory bandwidth is the bottleneck with this card. When will PCB makers finally drop 64-bit buses for graphics? Come on, it's 2007, for crying out loud :D Even my old GeForce 2 had a 128-bit bus :D
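
    For reference, the rough math behind that bandwidth complaint (a quick Python sketch; treating the memory as double data rate is my assumption, and the figures are theoretical peaks only):

    Code:
    # Rough peak memory bandwidth for a GPU memory bus.
    # Assumes double-data-rate memory, so effective transfers per second
    # are twice the listed clock. Illustrative peak figures only.
    def peak_bandwidth_gbs(bus_width_bits: int, mem_clock_mhz: float, ddr_factor: int = 2) -> float:
        bytes_per_transfer = bus_width_bits / 8
        transfers_per_sec = mem_clock_mhz * 1e6 * ddr_factor
        return bytes_per_transfer * transfers_per_sec / 1e9

    print(peak_bandwidth_gbs(64, 400))   # stock 400 MHz memory on a 64-bit bus -> 6.4 GB/s
    print(peak_bandwidth_gbs(64, 500))   # memory overclocked to 500 MHz        -> 8.0 GB/s
    print(peak_bandwidth_gbs(128, 400))  # same clock on a 128-bit bus          -> 12.8 GB/s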
     

    Attached Files:

    Last edited by a moderator: Jan 29, 2015
  7. mrg666

    mrg666 Notebook Evangelist

    Reputations:
    147
    Messages:
    376
    Likes Received:
    0
    Trophy Points:
    30
    Hi LlamaOne,
    I also used NiBiTor 3.5, but there is a problem. My laptop is almost exactly the same as yours; mine is a Vostro 1500 with an 8400M GS.
    - When I read the BIOS with NiBiTor, it said the device was not recognized. I chose the 8xxx series and read the BIOS into NiBiTor anyway.
    - I edited the BIOS and saved the modded file to disk.
    - When I tried to flash the modded BIOS using nvflash, it warned that the subsystem IDs did not match. After seeing this warning I stopped and did not continue the flash.

    The subsystem ID in the BIOS that NiBiTor saved is 01F1, while the real one is 0228. Did you see this problem as well, and if so, how did you resolve it?
     
    Last edited by a moderator: Jan 29, 2015
  8. mrg666

    mrg666 Notebook Evangelist

    Reputations:
    147
    Messages:
    376
    Likes Received:
    0
    Trophy Points:
    30
    Well, never mind. I could not resist and finally flashed the modded BIOS, with its different subsystem ID, to the 8400M GS in my Vostro 1500. And it worked.

    Using NiBiTor 3.5, I increased the default 400/400 3D-performance clocks to 600/550; 600/550 was the maximum without artifacts in ATITool. I also reduced the 2D voltage in the BIOS from 1.15 V to 1.0 V. Since dynamic power scales with the square of voltage, that reduction should cut GPU power at 2D clocks by about 24%: 100 × (1 − (1.0/1.15)²) ≈ 24.
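
    Here is the same calculation as a quick sketch (Python, illustrative only; it assumes dynamic power dominates and scales with V² at a fixed clock):

    Code:
    # Estimated change in GPU power from a voltage change at fixed clocks,
    # assuming dynamic power dominates and scales with voltage squared.
    def power_change_pct(v_old: float, v_new: float) -> float:
        """Percent change in power when moving from v_old to v_new at the same clock."""
        return 100 * ((v_new / v_old) ** 2 - 1)

    print(power_change_pct(1.15, 1.00))  # 2D voltage drop: about -24%
    print(power_change_pct(1.15, 1.20))  # raising 3D voltage to 1.2 V would be about +9%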

    The following are 3DMark06 scores with the default 101.38 driver from Dell and with the 163.71 driver:

    101.38 default clocks -> 1106
    101.38 overclocked -> 1452
    163.71 default clocks -> 1365
    163.71 overclocked -> 1815

    The maximum GPU core temperature was 62C while running the 3DMark06 test with the 163.71 driver when overclocked. I used SpeedFan 4.33 and ATITool to measure the temperatures; they both gave the same reading.

    So, the overclock plus the driver upgrade gave me 64% higher 3D performance. I am ready for Half-Life 2: Episode Two :)
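
    For anyone checking the arithmetic, the gains fall straight out of the scores above (a quick sketch, baselined on the stock driver at stock clocks):

    Code:
    # Percent gains computed from the 3DMark06 scores listed above.
    scores = {
        ("101.38", "default clocks"): 1106,
        ("101.38", "overclocked"): 1452,
        ("163.71", "default clocks"): 1365,
        ("163.71", "overclocked"): 1815,
    }
    baseline = scores[("101.38", "default clocks")]
    for (driver, clocks), score in scores.items():
        print(f"{driver} {clocks}: {score} ({100 * (score / baseline - 1):+.0f}%)")
    # 163.71 overclocked -> +64% over the 101.38 default-clock baseline.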

    PS: I forgot to include these
    CPU: T5470 1.6GHz Core2 Duo
    Screen: 1680x1050 WSXGA+
    3DMark06 ran at 1280x1024
     
  9. LlamaOne

    LlamaOne Notebook Enthusiast

    Reputations:
    0
    Messages:
    22
    Likes Received:
    0
    Trophy Points:
    5
    Hi mrg666
    My device ID is 0427 (unsupported), but I did not have your problem.
    I was afraid to play with the voltages, but now that you've tried it first I had to do it myself, so I increased the 3D voltage to 1.2 V and got to 650/1500 core/shader :D
    I also set my 2D voltage to 1 V, and with 90/90 core/memory the laptop draws 15.3-15.7 W idling, no different from stock frequencies.
    So you got to 550 MHz with the memory... better than mine, but still nowhere near the 700-800 I've seen some reach in this thread. Maybe those guys really don't mind having a bunch of nasty artifacts in their games :D or am I just being jealous?

    3DMark06: 2060 at 1024x768 with 650/1500/500; it was 1300-something with stock frequencies and 163.44.
     
  10. mrg666

    mrg666 Notebook Evangelist

    Reputations:
    147
    Messages:
    376
    Likes Received:
    0
    Trophy Points:
    30
    I added more data to my previous post about the CPU and screen resolution to explain why my score is lower than yours; my 3DMark06 screen resolution is higher.

    ATITool did not allow me to play with the shader frequency while searching for the highest stable frequencies; when I changed the core frequency, the shader/core ratio always stayed at 2. So, while setting the core frequency to 600 in NiBiTor, I also set the shader frequency to 1200 to preserve that 2:1 ratio. Is there any specific reason you chose 1500 for the shader frequency?
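
    For reference, the ratio bookkeeping is just multiplication (a quick sketch; the 2.31 figure is simply what 1500/650 works out to):

    Code:
    # Shader clock derived from a core clock at a fixed shader:core ratio.
    def shader_clock(core_mhz: float, ratio: float = 2.0) -> float:
        return core_mhz * ratio

    print(shader_clock(600))         # 2:1 link preserved in NiBiTor -> 1200 MHz
    print(shader_clock(650, 2.31))   # ~1500 MHz, i.e. 650/1500 pushes past the 2:1 link
    print(1500 / 650)                # ~2.31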

    Oh, by the way, how did you measure that wattage value?

    Thanks,
     