Hey guys,
I have the 2.4 GHz MBP. At stock settings I am getting 4102 3DMark06 points at a resolution of 1280x800. Since I have a native res. of 1440x900, that is as high as I can go. However, both RivaTuner and NVIDIA nTune claim my graphics card is clocked at 375 MHz core and 502 MHz memory. Is my card underclocked?
Also, when I boost the core and memory speeds by 100 MHz with nTune and benchmark, my 3DMark06 score GOES DOWN! I am very confused here. Lastly, RivaTuner has no effect on my 3DMark06 scores; it does not seem to work.
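For what it's worth, here is a quick sanity check of how far those reported idle clocks sit below the stock figures quoted elsewhere in this thread (475/700 per the NVIDIA site). This is just arithmetic on the numbers posted here, nothing read from the card:

```python
# Compare the clocks RivaTuner/nTune report against NVIDIA's listed
# stock clocks (475 MHz core, 700 MHz memory, per the NVIDIA website).
stock = {"core": 475, "memory": 700}
reported = {"core": 375, "memory": 502}

for domain in stock:
    pct = 100 * (stock[domain] - reported[domain]) / stock[domain]
    print(f"{domain}: {reported[domain]} MHz is {pct:.0f}% below stock")
```

That is a 21% core and 28% memory deficit if those really were the 3D clocks, which would be a big gap; as posters below note, they are more likely just the idle (2D) clocks.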
-
-
Yes, Apple has it slightly underclocked. This is probably done for heat reasons.
-
Well, what program can I use in Windows, other than ATITool or RivaTuner (which don't work), to put my card at stock speeds? Which, if I recall, are 470/600.
-
Core Clock (MHz): 475
Memory Clock (MHz): 700
These are the speeds on the NVIDIA website. 4102 in 3DMark06 is a really good score... -
Mine (the 128MB 8600M GT) is clocked at 470 MHz for the core and 630 MHz for the memory. Are you sure those are the maximum values? The card clocks down when not in use.
-
RivaTuner is rather confusing. I know what washington101 is talking about. If you click on the information button for the GPU, mine said something like core clock - 405, memory clock - 510. But when you gather clock speeds to overclock in the drivers section, RivaTuner will say that the "performance 3d" values are 470 and 635 (the purported stock speeds). Either way, I'm using the 162.18 drivers, which are far better than the stock drivers, but I can't overclock with them.
Is there a simple program that will allow us to check the clock speeds? -
I don't think Apple limits the clock speeds; rather, they lower them when the card doesn't need them. I know they did this with the X1600, and I see no reason they wouldn't continue this system with the NVIDIA card.
Doing this allows for maximum processing power when needed, but lower heat and energy draw when all you're doing is reading text or something similar to that. -
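To illustrate the idea being described (the driver switching between a low-power 2D profile and a full-speed 3D profile based on load), here is a toy sketch. The profile numbers are illustrative, taken from the clocks mentioned in this thread, not the actual 8600M GT driver tables:

```python
# Toy model of driver-side clock scaling: pick a clock profile based on
# how busy the GPU is. Numbers are illustrative, from this thread.
PROFILES = {
    "2d_idle": {"core": 375, "memory": 502},  # desktop, reading text
    "3d_load": {"core": 470, "memory": 635},  # games, benchmarks
}

def pick_profile(gpu_busy_percent):
    """Return the profile name for a given (hypothetical) GPU load."""
    return "3d_load" if gpu_busy_percent > 50 else "2d_idle"

print(PROFILES[pick_profile(10)])  # light desktop use -> idle clocks
print(PROFILES[pick_profile(95)])  # benchmark run -> full 3D clocks
```

The point is that a monitoring tool sampling the card at the desktop will only ever see the idle profile.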
When I open RivaTuner and use the drop-down menu to select clock values for "performance 3d", I get values of 375 and 502. I am also using Vista. This is beginning to upset me, as I would like my card to perform at least at stock.
Hmm... OK, so in RivaTuner, there is a box to the left of the core and memory clock. In these two white boxes I have 375 for core and 502 for memory. However, the blue slider bar for the core goes to a maximum of 565 and the slider bar for the memory goes to a max of 755. Are those the clock settings that the card will reach when it is being pushed? Or are they just the max the program allows? -
That is the maximum that RivaTuner will let you overclock to. A bit low for the core (you can easily reach 600 or more) and a bit too high for the memory (you might be able to reach 755, but you would be pushing your luck).
Have you tried ATITool? I found it to work a bit better. (Yes, it works with NVIDIA cards.) -
Wave,
I tried ATITool, but Vista said something like "ATITool wants access to some kernel component that affects your computer. We are terminating the program." I was not sure how to tell Vista to mind its own business! Any ideas? -
Hmmm, yeah, Vista... No, sorry, can't help you there. I only have XP on my MacBook Pro. Maybe someone else can help. You could try different drivers, but I can't help you there either.
Your 3DMark score looks fine, btw. I don't think your card is underclocked. If it were clocked at what you stated in the first post, it wouldn't get 4000+ points but rather around 3500 by my estimate.
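That estimate is easy to sanity-check with a rough back-of-envelope: if 3DMark06 scaled linearly with core clock (it doesn't exactly, but it's a first approximation), a card actually locked at 375 MHz instead of ~470 MHz would score roughly:

```python
# Back-of-envelope: assume score scales linearly with core clock.
# 4102 points at the ~470 MHz 3D clock -> what would 375 MHz give?
score_at_470 = 4102
estimated_at_375 = score_at_470 * 375 / 470
print(round(estimated_at_375))  # ~3273
```

That lands in the same ballpark as the ~3500 estimate, which supports the conclusion: a 4102 score means the card is hitting its full 3D clocks, and 375/502 are just the idle readings.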
I have yet to find an application that tells me the GPU clock in OS X. Anybody know? -
-
What is the latest version of ATITool? Hmm... when I try ATITool, the profile that is loaded is "default (0.00/0.00)". How do I make it detect my video card? Also, when I hit "find max core" it says "the video card you selected for overclocking in ATITool does not seem to be used by Windows." I am so confused! Is there any way to overclock my card or to even see what the real clock speeds are? I am using version 0.26 of ATITool.
Dammit, if you go to the ATITool website, it says it does NOT work with Vista. Damn you Vista, you are so pretty, but so stupid! -
Would it be possible to run your diagnostic programs and then play a game? Your clock speeds should show an increase only when the card is put under load.
Also, I find it funny that you have a Mac and you think Vista is pretty. -
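The suggestion above (watch the clocks while something 3D is running) can be sketched like this. `read_core_clock()` here is just a stub standing in for whatever the monitoring tool exposes; in practice you would log the clock once a second with RivaTuner's hardware monitor while the game runs:

```python
# Sketch: poll the reported core clock during a gaming session and keep
# the peak. The samples list is a stand-in for real driver readings.
samples = [375, 375, 470, 470, 375]  # idle, idle, 3D load, load, idle

peak = 0
for clock in samples:  # in practice: poll the monitoring tool once/sec
    peak = max(peak, clock)

print("peak core clock:", peak)  # the 3D-load clock, not the idle one
```

If the peak comes back at ~470 under load, the card is not underclocked; the 375 reading was just the 2D idle state.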
masterchef341 The guy from The Notebook
The MacBook Pro is certainly set to 470/635 under 3D load.
Finally, ATITool can at least show me the temperature of the GPU in Windows XP.
I need a program that can change the clocks, though. Any suggestions? I want to set a custom clock for the GPU (I don't mind doing this in BIOS/EFI before Windows starts).
MBP 8600gt Underclocked?
Discussion in 'Apple and Mac OS X' started by washington101, Jul 26, 2007.