The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, to make sure that the valuable technical information posted on the forums is preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.

    Highest 8600m GT Overclock

    Discussion in 'Dell' started by bsang, Aug 1, 2007.

  1. Udi

    Udi Notebook Consultant

    Reputations:
    42
    Messages:
    165
    Likes Received:
    0
    Trophy Points:
    30
    You got 2513/2415/2386

    I got 2516/2331/2397

    Hmm... I wonder what that SM3.0/HDR score depends on... it seems to be where mine has fallen behind.
     
  2. halkyon

    halkyon Notebook Consultant

    Reputations:
    6
    Messages:
    112
    Likes Received:
    0
    Trophy Points:
    30
    I am. ATITool gave a few artifacts at 700 core for me; reducing it to 685 seems to produce no artifacts, so I'm staying at 685/1000 for now. I just checked the temperatures at these overclocked settings, and they are only ~2-3 C higher than stock.
     
  3. Udi

    Udi Notebook Consultant

    Reputations:
    42
    Messages:
    165
    Likes Received:
    0
    Trophy Points:
    30
    Are you using RivaTuner or ATITool for the actual overclocking?
    I'm using RivaTuner 2.02... for some reason the newer version didn't let me save presets (I think).
     
  4. Jlbrightbill

    Jlbrightbill Notebook Deity

    Reputations:
    488
    Messages:
    1,917
    Likes Received:
    0
    Trophy Points:
    55
    I'm using Rivatuner 2.07, on XP. With 175.80 I can't go past 615/515 without artifacting in ATITool, so this is right about what my peak will be. I get right around 4310-4320 with it. In games like TF2 I don't break 70 C or even 68 C but Mass Effect stressed my GPU a lot more (CPU too) so I've hit 73 C at those clocks. Still well within operating limits.
     
  5. halkyon

    halkyon Notebook Consultant

    Reputations:
    6
    Messages:
    112
    Likes Received:
    0
    Trophy Points:
    30
    I was using RivaTuner, but I'm using ATITool now because of the artifact-detection feature that comes with it. It also shows the GPU temperature as a little number in the system tray. :)
     
  6. Udi

    Udi Notebook Consultant

    Reputations:
    42
    Messages:
    165
    Likes Received:
    0
    Trophy Points:
    30
    That's a pretty good score Jlbrightbill; the GDDR3 card only gets a hair more than that out of the box. I think you'll be running most new games pretty decently.

    halkyon - NFSU2 started artifacting at 700 core, and was still doing it at 688. Dropped it to 685 and she's sweet. I'm surprised our cards seem so consistent, given all the variation that's supposed to happen between chips.

    By the way, the clock limits seem to be shared between core and memory, so dropping the core to 685 lets the memory max out at 1030. I'll do some testing at those settings and see how it goes.

    Still wondering why my setup topped out ~85 marks under yours given pretty similar configs. Like you, I have some junk installed on my OS which might be affecting it; I'll try a clean install when I have some spare time. Tempted to make an image of the installation next time I do it.

    Edit - nice, I might switch to ATITool, the artifact-detection feature sounds cool. Does it let you save profiles/presets like RivaTuner? And can you check whether it will allow a setting of 685/1030?
     
  7. halkyon

    halkyon Notebook Consultant

    Reputations:
    6
    Messages:
    112
    Likes Received:
    0
    Trophy Points:
    30
    It does indeed. There's a dropdown list of clock presets which you can create/save/load; I've got mine set already. The artifact detection is the best part of it, and there are also tools that attempt to find the highest stable GPU/memory clocks (running an automatic artifact scan after each increase in clock speed). As far as I can tell, it appears to allow any clock setting too, without a maximum like RivaTuner has.
     
  8. tomk7

    tomk7 Notebook Guru

    Reputations:
    0
    Messages:
    72
    Likes Received:
    0
    Trophy Points:
    15
    Will I be able to set up ATITool so that at system startup everything is at stock, but when I run a 3D game it overclocks?
     
  9. conzy

    conzy Notebook Guru

    Reputations:
    0
    Messages:
    69
    Likes Received:
    0
    Trophy Points:
    15
    I know it can be done with RivaTuner by selecting "apply settings at startup", but I am not as familiar with ATITool...

    What you want to do is set the 2D clocks to stock (or lower) and the 3D clocks to your overclocked settings; then when a 3D app is launched, the clocks will switch to your overclocked settings.
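    The 2D-stock / 3D-overclocked setup described above amounts to picking a clock profile based on workload. A minimal sketch of that logic in Python - the profile dictionaries and `choose_profile` helper are illustrative, not part of any tool's API; the clock values are the stock 475/400 and overclocked 685/1000 figures mentioned elsewhere in this thread:

    ```python
    # Illustrative sketch of 2D/3D profile switching; not a real tool API.
    # Clock values (MHz) come from figures quoted in this thread.
    STOCK_2D = {"core": 475, "memory": 400}       # stock defaults
    OVERCLOCK_3D = {"core": 685, "memory": 1000}  # tested 3D overclock

    def choose_profile(running_3d_app: bool) -> dict:
        """Return the clock profile appropriate for the current workload."""
        return OVERCLOCK_3D if running_3d_app else STOCK_2D
    ```

    RivaTuner and ATITool implement the detection half of this themselves: the tool watches for a 3D application and applies the matching preset.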
     
  10. jfdube

    jfdube Notebook Evangelist

    Reputations:
    21
    Messages:
    320
    Likes Received:
    0
    Trophy Points:
    30
  11. Brutality

    Brutality Notebook Guru

    Reputations:
    0
    Messages:
    50
    Likes Received:
    0
    Trophy Points:
    15
    Hey guys, I'm new to overclocking... when you right-click on the desktop, go to Properties and then MSI Clock, I can see an option to move some 2D and 3D clock speeds along with the memory... is this for OC'ing? If so, what's the point of using RivaTuner or ATITool?

    Also, out of the box, is the XPS 1530 plenty for gaming, or must I OC it to get good performance? Thanks guys!

    ~B
     
  12. jfdube

    jfdube Notebook Evangelist

    Reputations:
    21
    Messages:
    320
    Likes Received:
    0
    Trophy Points:
    30
    Anything that moves the card's clocks is overclocking, yes. RivaTuner, ATITool, nTune, etc. are simply popular overclocking tools. Use whatever you'd like.

    As for the necessity to OC, it's entirely up to you and the level of performance you wish to attain in the games you play. Trial and tweaks; that's what it's all about.
     
  13. halkyon

    halkyon Notebook Consultant

    Reputations:
    6
    Messages:
    112
    Likes Received:
    0
    Trophy Points:
    30
    You could probably put it in the "Startup" folder so it starts automatically.

    Out of the box it already provides pretty good 3D performance. However, if you're looking to play higher-end games like Crysis or BioShock, then OC'ing the GPU is probably a good idea.
     
  14. Sepharite

    Sepharite Notebook Consultant

    Reputations:
    9
    Messages:
    273
    Likes Received:
    1
    Trophy Points:
    31
    What's the best driver for this card?
     
  15. Brutality

    Brutality Notebook Guru

    Reputations:
    0
    Messages:
    50
    Likes Received:
    0
    Trophy Points:
    15
    Cool. When people say something like 615/515, which numbers are they referring to - the 3D and 2D clocks, or the core clock/memory clock? My desktop says...

    2D - 350 MHz
    3D - 350 MHz
    Memory Clock - 1000 MHz...

    When I get the 1530, what should I set it at for something that will work better but not overheat/hurt the lappy? Thanks again guys.

    ~B
     
  16. halkyon

    halkyon Notebook Consultant

    Reputations:
    6
    Messages:
    112
    Likes Received:
    0
    Trophy Points:
    30
    From my experience, 174.31 is the best driver.

    Overclock the "Performance 3D" mode; any other mode is probably best left at stock clocks.

    I'd say a safe overclock would be about 600/800 on "Performance 3D". However, I've found that 685/1000 still doesn't seem to hurt the lappy in my case, as temperatures are only ~2-3 C higher than stock under load.
     
  17. tomk7

    tomk7 Notebook Guru

    Reputations:
    0
    Messages:
    72
    Likes Received:
    0
    Trophy Points:
    15
    I only ever play games for 2 hours at a time MAX. After that I get too tired/bored.

    Could I get away with overclocking a bit too high, since it's usually only going to be running for an hour or so?
     
  18. halkyon

    halkyon Notebook Consultant

    Reputations:
    6
    Messages:
    112
    Likes Received:
    0
    Trophy Points:
    30
    You could. ATITool and RivaTuner both let you define presets, so once you've finished gaming you can drop the clocks back to stock defaults. This means you're only stressing the GPU for a short amount of time. However, I believe PowerMizer, which is included and enabled by default in the NVIDIA drivers, already does a good job of managing the load: if you're not doing anything demanding, it switches into a "standard 3D" or "2D" mode, which has lower clocks and thus lower temperatures - a very useful feature for extending the GPU's life.

    Udi and I both have a maximum of 685/1000 for GPU/memory respectively. This appears to be the absolute maximum overclock, as any setting beyond that produces artifacts and instability. This only applies to the XPS M1530 with the 8600M GT GDDR3, other hardware will have different limits.

    I recommend using ATITool 0.27 beta 2 (beta 4 crashes for me), as it also shows you the temperature in your system tray. This is very useful for a quick glance at how the temp is looking.
     
  19. tomk7

    tomk7 Notebook Guru

    Reputations:
    0
    Messages:
    72
    Likes Received:
    0
    Trophy Points:
    15
    Cheers! +rep

    Does this PowerMizer only affect GPU clocks? I thought I read something about it also halving the CPU multiplier when running on battery.

    Either way, I also saw on laptopvideo2go that you can edit the INF file so that PowerMizer is disabled anyway.
     
  20. halkyon

    halkyon Notebook Consultant

    Reputations:
    6
    Messages:
    112
    Likes Received:
    0
    Trophy Points:
    30
    Apparently it only affects GPU clocks. AFAIK, the CPU is throttled based on the BIOS, or some other mainboard feature.
     
  21. Jlbrightbill

    Jlbrightbill Notebook Deity

    Reputations:
    488
    Messages:
    1,917
    Likes Received:
    0
    Trophy Points:
    55
    I recommend undervolting your CPU to cut down on heat if you plan on overclocking. There's a great tutorial by flipfire in the Hardware forum. I dropped my peak load temps by around 10 C.
     
  22. tomk7

    tomk7 Notebook Guru

    Reputations:
    0
    Messages:
    72
    Likes Received:
    0
    Trophy Points:
    15
    I really want to undervolt my CPU, but I don't want RMClock running all the time, so I'm not going to.
     
  23. Udi

    Udi Notebook Consultant

    Reputations:
    42
    Messages:
    165
    Likes Received:
    0
    Trophy Points:
    30
    Heat isn't the limiting factor when overclocking the GPU in this notebook anyway; temps actually stay fairly low. The hardware just starts playing up beyond a certain speed (missed calculations, etc.), hence the artifacting, usually before it even gets hot.

    I don't think there's a need to undervolt. I tend to agree with you; I hate extra programs running... I just start ATITool/RivaTuner after boot, set the clocks, and close the application completely (unless you want to monitor temps, but you only really need to do that once). Works great.
     
  24. tomk7

    tomk7 Notebook Guru

    Reputations:
    0
    Messages:
    72
    Likes Received:
    0
    Trophy Points:
    15
    The only thing is, I'd prefer to use the 3D detection in ATITool so that it only loads the overclocked profile when a 3D app is running.

    That way, when I'm just on the internet or whatever, the GPU can take it easy. But the only way to do that is to have ATITool constantly running, so I'm not sure.

    BTW, what is the best version of RivaTuner?
     
  25. halkyon

    halkyon Notebook Consultant

    Reputations:
    6
    Messages:
    112
    Likes Received:
    0
    Trophy Points:
    30
    If you're just surfing and doing normal stuff, then it's safe to leave the GPU clocked higher. This assumes you're only overclocking the "performance" clocks, since the GPU never uses those until you're playing games that start demanding more of it; in any other case it should always use the lower clocks ("2D" and "standard"). I believe this is a feature of PowerMizer, which comes with the NVIDIA drivers.
     
  26. tomk7

    tomk7 Notebook Guru

    Reputations:
    0
    Messages:
    72
    Likes Received:
    0
    Trophy Points:
    15
    Thanks. I've never seen that option in ATITool though.

    It seems as if it's either overclocked or not. I am probably using old drivers though.
     
  27. halkyon

    halkyon Notebook Consultant

    Reputations:
    6
    Messages:
    112
    Likes Received:
    0
    Trophy Points:
    30
    Yeah, that's true. It's a very useful feature for laptops though! :)
     
  28. Jlbrightbill

    Jlbrightbill Notebook Deity

    Reputations:
    488
    Messages:
    1,917
    Likes Received:
    0
    Trophy Points:
    55
    You have 4 GB of RAM and a T9300 -- There's no reason you couldn't have RMClock or Rivatuner running in the background.
     
  29. mmhorda

    mmhorda Notebook Enthusiast

    Reputations:
    0
    Messages:
    10
    Likes Received:
    0
    Trophy Points:
    5
    XPS 1530
    Core 2 Duo 2.4 GHz (3 MB cache)
    2 GB RAM
    Windows Vista (no SP1) - fully updated
    No overclocks

    3DMark06
    1280x1024
    Score: 4101
     
  30. mattocs

    mattocs Notebook Deity

    Reputations:
    13
    Messages:
    732
    Likes Received:
    0
    Trophy Points:
    30
    I did a bit of overclocking...

    3dMark06 @ 1280x800


    3dMark06 @ 1440x900

     
  31. SomeFormOFhuman

    SomeFormOFhuman has the dumbest username.

    Reputations:
    1,037
    Messages:
    1,012
    Likes Received:
    0
    Trophy Points:
    55
    Just updated my score.

    Stats:

    Dell Inspiron 1720
    Windows Vista Home Premium
    8600M GT DDR2 (using 175.80 8700M GT drivers)
    Highest temp: approx. 65-68 C
    1280x1024 = 4342 marks

     
  32. Just Lou

    Just Lou Notebook Evangelist

    Reputations:
    62
    Messages:
    349
    Likes Received:
    0
    Trophy Points:
    30
    Wow 700+ on the core. I wish I was brave enough to try. ;)
     
  33. SomeFormOFhuman

    SomeFormOFhuman has the dumbest username.

    Reputations:
    1,037
    Messages:
    1,012
    Likes Received:
    0
    Trophy Points:
    55
    No, actually I'm running at 650 core. RT isn't accurate at reading frequencies - so says the creator of RT on the guru3d forums: "not supporting on mobile platforms, never will and never be".
     
  34. wywern209

    wywern209 NBR Dark Knight

    Reputations:
    47
    Messages:
    979
    Likes Received:
    0
    Trophy Points:
    30
    You need to use the 17x.xx series drivers for the best performance.
     
  35. Just Lou

    Just Lou Notebook Evangelist

    Reputations:
    62
    Messages:
    349
    Likes Received:
    0
    Trophy Points:
    30
    Oh OK. Glad you said something before I tried. :) I can run mine at 650 with no problems, although I don't leave it that high.
     
  36. SomeFormOFhuman

    SomeFormOFhuman has the dumbest username.

    Reputations:
    1,037
    Messages:
    1,012
    Likes Received:
    0
    Trophy Points:
    55
    And isn't 175.80, which I'm using, one of the best drivers for the 8600M GT? If there's a score higher than mine with the same configuration but a different driver, I'd like to see it. :D
     
  37. SomeFormOFhuman

    SomeFormOFhuman has the dumbest username.

    Reputations:
    1,037
    Messages:
    1,012
    Likes Received:
    0
    Trophy Points:
    55
    But so far so good, no artifacts yet. :p I've been using these speeds all the time @ 650/1300/526. I usually clock it back down to the default 475/400 when I'm not gaming and at startup.
     
  38. SomeFormOFhuman

    SomeFormOFhuman has the dumbest username.

    Reputations:
    1,037
    Messages:
    1,012
    Likes Received:
    0
    Trophy Points:
    55
  39. lancerr

    lancerr Notebook Consultant

    Reputations:
    1
    Messages:
    129
    Likes Received:
    0
    Trophy Points:
    30
    Here were my clock settings and 3DMark06 scores.

    The first one was stock as it came from Dell. I settled on 601/902 since the further improvement wasn't great and I wanted a stable system.

    3903 - 153.XXX
    4117 - 174.31
    4522 - 174.31 Clock changed to 500/902
    5156 - 174.31 Clock changed to 601/902
    5289 - 174.31 Clock changed to 625/902
     
  40. cehennemli

    cehennemli Notebook Enthusiast

    Reputations:
    0
    Messages:
    19
    Likes Received:
    0
    Trophy Points:
    5
    Hi, I have an Acer 5920G 933G25MN with an 8600M GT GDDR2 512 MB.
    I'm trying to overclock, but I can't push the memory clock up to 460 MHz. I can run 660/458, but not 660/460 or 470... why? (Sorry, my English is limited.)
    What can I do?
     
  41. gskuse

    gskuse Newbie

    Reputations:
    0
    Messages:
    1
    Likes Received:
    0
    Trophy Points:
    5
  42. Seliverus

    Seliverus Newbie

    Reputations:
    0
    Messages:
    2
    Likes Received:
    0
    Trophy Points:
    5
    There, that's as good as I can push it.

    i213.photobucket.com/albums/cc55/CMStudio/superscore.jpg
    (It won't let me post URLs, so that's why it's not a link.)

    3DMark06 score: 7243 @ 1024x768

    Laptop: Dell XPS M1530, Vista 32-bit SP1
    Processor: T8300 Core 2 Duo, 2.4 GHz, 800 FSB
    RAM: 4 GB DDR2 dual channel, 667 MHz
    GPU: 8600M GT 256 MB GDDR3

    GPU modifications:
    178.13 drivers for the 8600 GTS (Windows Update) (RT-strapped)
    (RT-overclocked) core 680 MHz | shader 1360 MHz | memory 1030 MHz
    (ATITool tested for 15 mins)

    Do I need to add anything else?
     
  43. yomamasfavourite

    yomamasfavourite Notebook Evangelist

    Reputations:
    51
    Messages:
    681
    Likes Received:
    0
    Trophy Points:
    30
    Firstly - you shouldn't revive dead or near-dead threads
    (this one is really 3-4 months old); it's much better to start a new one that people will look at. :)

    Secondly, I think for it to be a true 3DMark score it has to be run at 1280x1024, obviously for consistency.
     
  44. Seliverus

    Seliverus Newbie

    Reputations:
    0
    Messages:
    2
    Likes Received:
    0
    Trophy Points:
    5
    Sorry, I thought this was some sort of standardized thread to find the top score, because the thread has been active for over a year. My bad >.<*

    I ran it at 1440x900 because 3DMark06 on this laptop does not have the 1280x1024 option.

    Score: 5945
    HWMonitor showed a max GPU temp of 78 C

    I'm quite happy with the 8600M GT and am looking forward to moving this laptop to 64-bit and upgrading to 8 GB of RAM in the future; possibly even investing in a better processor ^-^
     
  45. MechMykl

    MechMykl Newbie

    Reputations:
    0
    Messages:
    3
    Likes Received:
    0
    Trophy Points:
    5
    Hey, I wanted to thank everybody for their posts on this forum, they helped me a lot! I have a 15 inch Macbook Pro (2008 "Multibody"/Santa Rosa) that I have boot camped to Windows XP 32bit.

    Stats:
    Intel(R) Core(TM)2 Duo CPU T9300 @ 2.50GHz
    NVIDIA GeForce 8600M GT 512 MB GDDR3 [Hardware Monitor says driver is: 6.14.11.6763]
    CORSAIR 4GB (2 x 2GB) DDR2 667 (PC2 5300)
    Stock: 470/635; Idles at ~57C / 3DMark06 4094 (1280x1024)
    Overclocked with ATITool: 595/700 at ~80C / 3DMark06 4895 (1280x1024)

    As a test, with overclocking turned on it gives me anywhere between a 10-20 fps boost in Left 4 Dead with everything on high at 1280x800.

    I tried a core of 600, but it produced a couple of artifacts for a second every 5 minutes.
    I also tried 680/1000, but it crashed my driver! haha.

    Any tips on what to do about the memory slider? I took anywhere between 25-50 off the slider and it seemed to have no effect.
     
  46. Apollo13

    Apollo13 100% 16:10 Screens

    Reputations:
    1,432
    Messages:
    2,578
    Likes Received:
    210
    Trophy Points:
    81
    Firstly, be more careful with your overclock! If a core clock of 600 gives artifacts, don't try pushing it any further! Going all the way to 680/1000 was very reckless - that's the sort of thing that fries video cards. Increasing by 5 MHz or so at a time until you get artifacts is the recommended approach.

    As for the memory slider, I'd just add 5 or 10 MHz and run 3DMark06; there ought to be a small difference. If it isn't doing anything, try RivaTuner or nTune - ATITool naturally doesn't have exceptional NVIDIA support. Also be aware that the temperature sensor on the GPU is nearer to the core than to the memory, so overclocking the memory is a bit riskier - you won't know its exact temperature. I stick to core overclocking myself to be cautious, but as long as you keep the overclocks moderate, you should be okay.
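    The "raise it ~5 MHz at a time and stop at the first artifacts" advice is essentially a linear stability search, which ATITool's "find max clock" feature automates. A minimal sketch of that loop - `set_core_clock` and `artifact_scan_passes` are hypothetical stand-ins for whatever controls the tool actually exposes, and the start/ceiling values are the stock 475 and ~700 MHz figures quoted in this thread:

    ```python
    def find_max_stable_clock(set_core_clock, artifact_scan_passes,
                              start=475, step=5, ceiling=700):
        """Raise the core clock in small steps; return the last clock that
        passed an artifact scan. Stops at the first failure or the ceiling."""
        best = start
        clock = start
        while clock + step <= ceiling:
            clock += step
            set_core_clock(clock)           # hypothetical tool control
            if not artifact_scan_passes():  # hypothetical artifact scan
                set_core_clock(best)        # back off to last known-good clock
                return best
            best = clock
        return best
    ```

    With a scan that starts failing above 685 MHz, this would settle on 685 - matching the ceiling Udi and halkyon reported earlier in the thread.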

    Also, your video card driver is pretty old - version 167.63 (the last five digits). You might want to download a newer one at www.laptopvideo2go.com.
     
  47. MechMykl

    MechMykl Newbie

    Reputations:
    0
    Messages:
    3
    Likes Received:
    0
    Trophy Points:
    5
    I tried the 680 overclock before I even knew that the 600 overclock was too much for me... I just saw a high number in one of the posts and thought "why not on mine?" Sorry, I'm brand new to this and learning as I go!

    I overclocked the memory by 50 and was able to get 4994 3DMarks with everything else at the same settings, with only one degree of temperature difference.

    With my base overclock (posted earlier), my fan would idle at 2000 rpm and then spin up to a higher level (somewhere between 4000 and its max of 6000; sadly, I haven't found any way to adjust the fan speed manually on XP, and I've looked at a lot of programs... any tips?) when the temperature hit 85, and then it would keep things at a cool ~80.

    The new memory test of +50 (done in increments of 10) gained me a degree before the fans started up and then kept the same idle temperature... is it worth it? I've heard GPUs like this one are fine anywhere between 80-100 C, but I'd like to stay in the safe-ish zone.

    BTW: Anything under +40 never gained me a degree before the fans started up, but still got me to 4962 marks.

    One last question: do you recommend that I upgrade to the latest drivers (185.20, XP 32-bit, 2008-12-26), or should I use a previous one that's been tested stable? Thank you so much again for your help!
     
  48. Hualsay

    Hualsay Notebook Evangelist

    Reputations:
    145
    Messages:
    554
    Likes Received:
    0
    Trophy Points:
    30
    Wow, that's insane :eek:
     
  49. MechMykl

    MechMykl Newbie

    Reputations:
    0
    Messages:
    3
    Likes Received:
    0
    Trophy Points:
    5
    Bump.

    Just wondering if the latest drivers are the best way to go.

    Also wondering if 85 is all that preferable to 86 for the 10 seconds it's like that before it levels out at 80 C.
     
  50. Apollo13

    Apollo13 100% 16:10 Screens

    Reputations:
    1,432
    Messages:
    2,578
    Likes Received:
    210
    Trophy Points:
    81
    As for drivers - I don't know enough about each particular one to know which is best per se, although they generally improve with time. I generally stick with whatever I currently have unless it's giving me problems, and if it's been a few months I might give newer ones a try. If 185.20 hasn't given you any problems, you might as well stick with it. Or ask at laptopvideo2go.com - I'm sure there's someone there who could tell you which recent one has a particularly stellar stability/overclocking record. I've been running 181.20 Dox optimized for about a month and a half myself; it gave me a BSOD on Windows 7 (with the Vista 64-bit drivers) but has been excellent on XP 32-bit thus far.

    I wouldn't worry too much about one degree in the mid-80's range.

    As for a fan program, I've heard some people say Speedfan is a good one. Since I have a Dell, I use I8Kfangui, which works on pretty much any Dell, but it won't work on an Apple. The hardware forum or Apple forum might be able to help you more there if Speedfan doesn't work - most people in the Dell forum use I8kfangui because it's a nice simple solution that works for Dell. I'm not aware of any similar programs specific to any other manufacturer.
     