So you need a VBIOS mod: NiBiTor + nvflash.
Read NiBiTor manual carefully!
P.S.: My GTS160M only has two possible VIDs - 0.9 and 1.09 V (nothing in between) - so undervolting the lower P-states is impossible. Luckily that card can run a small OC at 0.9 V!
-
The GTX will blow up at 700/1750/1000. The GTS can do those clocks, but the GTX can't.
GTA settings that I use: first of all, for nVidia the best is the 1.0.4.0 patch. 1680x1050 (native), everything on High (it can go to Very High, but it doesn't make much of a difference), 30-100-100-0 (I don't want that power-hungry dynamic shadow thing), vsync on and definition on. With these settings I get over 40 fps in any situation; maybe when hundreds of cops are shooting at me and I'm blowing up 20+ police cars with an RPG it drops below 30, but that situation is very rare.
Also, I have a self-made texture mod that gives me 10 more fps.
Edit: According to GPU-Z my BIOS is 62.92.78.00.08, Clevo/KAPOK (1558). And it's idling at 41°C for now at 0.85 V -
Yes, those clocks might be a bit far-fetched. I also noticed that the laptop isn't using one of the card's connection fingers; the previous owner's laptop was. I don't know what difference that makes, but it may limit the card's power draw somewhat. I didn't mention the exact BIOS version, but now that I checked, it's the same as yours.
I don't really have a choice about what patch I run because I got the game via steam, but I understand that you're running shadows all-out? Because I definitely can't do that and my framerates are below yours at lower settings. Maybe the quad core is making quite a difference after all... Why do you have Vsync on btw?
I experimented a bit with clocks; I'm now flying a very safe 600/1500/900. I tried running the memory at 1000 MHz, but this caused the card to remain stuck in the P8 video mode. I tried editing P8 to run full speed anyway, and saw the card switch to throttle clocks (temps were normal), which was even worse. So it's definitely not liking 1 GHz mem.
In short, I'm still struggling a bit to get this configuration to work properly. It seems ThrottleStop is causing random crashes when trying to run games, and in-browser I get a lot of the dreaded Google Chrome 'aw-snap' errors. Maybe 1.1 V is too low, but it worked in the beginning, so I'm a bit confused. -
Is it hard to re-flash the video card's BIOS? With the exact same one, or an older but maybe compatible one.
-
No, flashing the VBIOS is easy (I recommend DOS mode).
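For reference, a typical DOS-mode session looks roughly like this (the filenames are just examples, and the exact override switches differ between nvflash versions, so check your version's built-in help before flashing anything):

```
REM Back up the current VBIOS first:
nvflash -b backup.rom

REM Edit backup.rom in NiBiTor, save the result as modded.rom,
REM then flash it; -4 -5 -6 are the switches commonly used to
REM override the ID-mismatch checks on a modified ROM:
nvflash -4 -5 -6 modded.rom
```

Keep the backup on the boot stick so you can flash back if the mod misbehaves.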
-
Ok, minor update, I think I finally got this down. The CPU required a bit more juice to operate smoothly; I had to bump my Vcore to 1.15 V to stop getting errors. I settled on 600/1500/900 for the GPU because it already nearly hits 100 degrees with the bottom cover on, and I'm happy to run GTS160M speeds with nearly twice the number of shaders.
Performance in GTA IV is still below par at times. Kirrr, did you play with or without shadows? And why are you looking to flash your bios if it's currently working fine?
Btw the USB3.0 card thingy still hasn't arrived. Ordered nearly two weeks ago, not saying I've been scammed yet but this is taking awfully long -
Actually I play with these settings: all High, shadows Very High, 30-100-100-0, on, on. Patch 1.0.4.0 and a more realistic but optimized texture pack. Plays fluently, just some frame skipping in action scenes (mentioned it before). It really needs more cores than two. And my quad is actually at 3.2 GHz per core, plus the stock Clevo clocks on the GTX.
No HDMI for me now, so that's the reason for the flashing. Did it yesterday, but HDMI still isn't working. Fortunately my VGA port works, but HDMI is more comfortable for me because the cable management is easier with it. Is your HDMI working?
I got the USB card fast, maybe just 6-8 workdays. I think that was pretty fast from Hong Kong to Hungary. -
Hmmm, I sold the QX9200, so going back isn't an option. I'll settle for around 30 FPS on current settings... Maybe it's time to accept that I need my desktop for the real eye-candy. If only liquid cooling weren't such a PITA to set up.
Funny thing is, I don't remember GTA IV performance being much better with the QX9200 + GT160M, so I assumed the GPU was the worse bottleneck.
Could you perhaps link to the texture pack you're using? I'm really wondering what kind of impact it would make. I have some minor texture popup issues and can definitely not turn up the line of sight stuff as far as you are. Let's see if I can get a texture pack working with steam in the first place...
I don't use HDMI so haven't gotten around to seeing if it works. I could check for you, but I expect it's broken since we're using the same card/bios. -
I don't remember where I downloaded the mod from, but I modded it myself too. By the way, it's 4.5 GB.
If I find a place I'll upload it so you can download it.
Please try the HDMI if you have time. Thank you. -
Unfortunately, HDMI didn't work with my laptop either. Windows did give the little beep to indicate hardware was detected when I plugged in an HDMI screen, but I couldn't get image even after a reboot.
I just got MW3 and despite it being almost identical to MW2, which runs fine with medium-high settings on this laptop, it runs terribly. With Fraps I see the GPU cranks out a steady 91 fps (the maximum) when I turn all the settings down, but somehow the game uses so much CPU that it lags badly and even froze several times. This makes multiplayer unplayable. CPU usage is over 90% most of the time. Temps for both CPU and GPU aren't even near what I hit in GTA IV. I googled but seem to be the only one experiencing these issues; still, I strongly suspect this is a case of bad programming.
Black Ops also had terrible CPU issues at launch, though it is weird MW3 suffers from this because there've been few changes to the game engine as far as I know... -
It is very CPU-demanding now. Using this old engine with tons of textures and eye-candy visuals made it power-hungry. Native resolution, no AA or aniso, everything maxed except ambient occlusion gives me a decent 50-70 FPS. If I turn off shadows and soften smoke edges, it stays constantly over 90.
Man, this machine still rocks!
So HDMI will never work. Shame. But hopefully I can trade my GTX for a Quadro FX 3700M. Moral Hazard got it to work with HDMI, so maybe that's the way out of this situation. It may run a bit hotter, but as soon as HDMI works I'll use it as a "portable desktop" computer: take off the bottom cover, put it on my cooler 24/7 and use it with my external screen. This VGA hardly ever gets above 70°C without the bottom cover and with my cooler, so maybe it can deal with the Quadro as well. -
Yeah I found out the problem could be remedied by turning off theater mode. No idea why it was enabled by default but it's an extremely CPU-heavy feature. CPU usage now is down by about 30%.
Not that it makes much of a difference, I finally got my main rig up and running again so won't be using the laptop as often anymore...
I'm glad I've got no use for HDMI; swapping a perfectly fine GTX260M for a Quadro seems like a bad deal to me, but if you won't be using the laptop for mobile applications, what gives. I just think you ought to consider building yourself a fast desktop instead -
Yeah, but I like the portability when I want it. And the Quadro will work, no question; I tried it some months ago. If HDMI doesn't work with the Quadro, I've just gained some more shaders.
-
Well let us know how the temps are when you get there
Btw my USB 3.0 adapter finally arrived just before last weekend, it works like a charm but has the same awkward fit you described earlier. Haven't tested if it actually reaches full USB 3.0 speeds as having the extra slots was more important than them being faster, oh well.
It's a shame it doesn't deliver power when the laptop is powered down, because USB 3.0 would allow for faster battery charging of connected devices, and I usually hook my phone up to it at night. -
Big day today! I finally managed to get HDMI working properly at native resolution. Thank you niffcreature. You are the best!
The trick: the .10 HP bios from the FX 3700m. Adjusted clocks/voltages to match the GTX. And it works! GPU-Z shows everything wrong (bios version, card type and so on) but who cares?! -
That's good news! Any chance that BIOS could be edited to display the correct information, purely cosmetic of course? Where did you get it? What programs did you use for the editing and flashing? Any unlockable shaders on the card?
I might give this a shot and see if the voltages can be tampered with. Would be interesting to see if it will run 600/1500/900 at 850mV (probably not but heh, if it's only in 3D reflashing won't be a problem) -
I don't know how to edit that, but the clocks and the shaders are correct; just the BIOS version, the vendor's name and the manufacturing process are wrong.
I got it from niffcreature. And I used NiBiTor to edit the clocks and nvflash to flash the card in DOS mode. No unlockable shaders.
I set the voltages to the default GTX ones. Maybe later I'll try to OC a little. -
I'm a bit confused. Are there only certain 1651s that support quad-core CPUs...?
-
-
You need 3 gray capacitors (or whatever) near the CPU socket. If they are there, a quad should work.
-
Thanks Kirrr. It looks like mine only has 2. So I'm guessing I'm out of luck.
-
Your best option is a T9900 or maybe an X9100.
-
Meaker@Sager Company Representative
-
Those:
Picture from Darth Bane. -
Meaker@Sager Company Representative
Yep, inductors.
-
Right now I have an SP9400 in my 1651. -
moral hazard Notebook Nobel Laureate
It's very hard to find that info. Some users have had trouble with quads and for others it's fine.
I can only guess that the difference comes down to some components (the inductors and a few black things).
I think if you try a quad in a motherboard that has 2 inductors, you'll probably be stuck with only 2 cores active. But that's just my guess (I read that one reseller tried a quad in a GX620 and was stuck with 2 of the 4 cores). -
I remember being stuck with 2 cores upon the initial boot, after a reboot it was fine.
-
-
Singular1ty: What about the temps with the X9100?
-
Under heavy load I've watched the processor hit maybe 70 degrees maximum. I do notice having 2 fewer cores impacts performance in some games but now that my desktop is up and running blistering speeds this isn't much of an issue. Unfortunately it's also keeping me from further tampering with the notebook, so I haven't been overclocking the CPU or fooling around with the GPU's bios yet. -
That sounds great. You have a cool one. Btw I've done a repaste, and for now the CPU maxes at 80°C at Q9100 clocks undervolted to the lowest possible 1.05 V, and the GPU tops out at 80°C. No external cooler, with the back panel on. I still have some hope for the bigger Dell with its two-fan cooling system. That will run cooler, I think.
-
I think I'm ready to buy a Q9200. Can anyone post a screenshot of the location of the 3 inductors so I can check whether my GT628 has them?
Also, this is the CPU I've been eyeing for quite a while. Do you think this seller is reputable?
link:
Q9200 2.4Ghz INTEL CORE QUAD QAVR extreme MOBILE CPU processor laptop monkey | eBay
Thanks in advance! -
Picture: one page back. lol
I have the same CPU from Laptopmonkey as well. -
My Q9200 also came from Laptopmonkey; you won't be disappointed
-
Thanks guys. Just confirmed that I have those 3 grey inductors near the socket. I'll be transferring funds to my PayPal now, then contact Laptopmonkey.
This is my first time upgrading a component in a laptop. If you guys know of some sort of guide to replacing a laptop CPU, please kindly post it too. Also, what else do I need to buy aside from thermal paste? Do I need thermal pads? What's the difference between IC Diamond 7 and 24? TIA!!! -
Laptopmonkey will give you some paste, but it's a bag of crap. ICD7 will be fine. I use some cheap Thermaltake now and it works well.
Guide: detach the fan cable, then undo the screws in the numbered order, apply the thermal paste, and reverse the process. -
Thanks for the reply Kirrr. I'll be buying the CPU when I get home from work. What else do I need to do prior to the upgrade? Do I need to flash a newer BIOS? Again, TIA!
-
No need for a newer BIOS.
-
Thanks for the help. Just bought the CPU from Laptopmonkey and IC Diamond 7 from FrozenCPU. Now the waiting game begins. I'll post again when I get the things I bought. I'll add rep to you when I'm able to again.
-
#deleted#
Never be so optimistic... -
That voltage is too high. Use ThrottleStop and set it to 1.1625 V; 1.3 V is enough for 3.2 GHz but too much for the stock clocks.
-
Hi Kirrr. Thanks for the tip. I just installed ThrottleStop and managed to bring the voltage down to 1.1 V. I'm not sure if I'm doing things right; here's the screenshot. I'll be leaving this running until around past 9pm.
Undervolting the Elitebook 8530′s processor with Throttlestop (part 5 of 10) | Jake on the move
I followed this guide, btw.
-
A bit high temps. With or without a cooler/bottom cover?
Uh, and I feel like I have some throttling issues. Maybe the AC adapter isn't enough? I don't know, but there's some stuttering in the most CPU+GPU-hungry games. Yesterday GTA simply crashed to the desktop, and there was a GPU downclock issue in Team Fortress 2. -
It's quite hot in my room (30ish °C). That's without a cooler and with the bottom cover closed. Anyway, I tried moving the voltage down to 1.088 V and here's what I'm getting.
This is with a cooler but the bottom closed. Is it safe to leave the bottom cover open, btw?
I also tried playing BF3 and it's playable now, around 30ish fps on low settings, but my GPU temps go up to 90ish, which also affects my CPU temps.
-edit-
Restarted Prime95 with the Large FFTs setting at 1.088 V, bottom cover closed and with cooler.
Cores 0 and 1 rose to 82 and 80 respectively.
-
I've used mine without the bottom panel for about 3-4 months with no problems.
-
Hmm, I removed the bottom cover and my idle temps did improve. Now I'm thinking of modifying the bottom cover by adding more holes and some mesh. Maybe next year I'll buy a bottom cover from rk.
-
Meaker@Sager Company Representative
My Q9200 did 2.66 GHz at 1.05 V, so maybe give that a try.
-
I'm not too confident about overclocking it yet, as I'm still having temp problems. Once I manage to deal with those, I might try overclocking the CPU.
MSI GT628 with Q9200
Discussion in 'MSI' started by Kirrr, Feb 6, 2011.