Original thread from Xtremesystems:
http://www.xtremesystems.org/forums/showthread.php?t=218157
I hate to make a brazen projection like this, but unless NVidia comes out with SLI GT200 cards in the Clevo, this laptop will hold the performance crown for at least a year. There is simply no card even on NVidia's development roadmap that can come close to this until 2010.
-
The mobile GT200 is set for the Clevo D90xF and M980NU this year.
-
Just kidding, someone posted it in the Asus section already. Insane to say the least.
-
I think the D90xF would be able to handle SLI with the GT200 series. The biggest reason would be the reduction in power consumption from going from 55nm to 40nm and from GDDR3 to GDDR5. The difference in power consumption between the desktop GTX280 (65nm) and the GTX285 (55nm) is pretty significant.
-
dondadah88 Notebook Nobel Laureate
here's a random benchmark.
http://forum.notebookreview.com/gallery/data/500/medium/crysis1.JPG what are you all focused on now? -
-
yep, sure did give the maybe, but also stated that it wouldn't be happening with a mid-ranged user on a low-end setup...
i tip my hat to that score though. very impressive indeed!
so if a gtx ran on a 1600-plus bus with an overclocked cpu, it would definitely put it up there.
what held the 3870s back was that they weren't too stable above 1150 mhz fsb running on 8x pci express. had this been opened up to 16x...the 3870s would have scored in the 18k-plus range, because they could then push the fsb to at least 1333mhz + an overclocked cpu @ 3.6 to 3.9
edit:
for those with the 3870's.... look at these heat sinks and compare them to yours... you may need to buy more than just cards as upgrades....
-
How did they overclock? Have they already found the PLL for setFSB? -
Also it looks like, can't remember where I read this, that the new G200 (or whatever the GT2XYs are based on) are coming out in mobile form sometime this year... the article didn't have a whole lot of info, just a link to some theoretical roadmap on nVidia's site IIRC...
@johnksss
Crap! That thing must weigh a ton with all of those heat sinks and pipes, plus 3 fans... wonder what the batt runtime was like...
Did that Russian guy that got the ASUS W90 with CFed 3870s ever post any benches? -
and i'm using an old ati catalyst driver, i'm sure i can do better with the new one -
Guys, do you think the same system with a Q9000 will perform better in 3DMark Vantage?
-
i think a more experienced member would be able to answer it better -
Oh right. I'm deciding between getting a 725 with a P9500 or an Advent 6555 with a Q9000 for a hundred quid cheaper.
So far it seems the Advent is really very similar to the 725, so I can probably install the 725's overclocking driver and get identical overclocking functionality. I could end up with a 2.4 GHz quad laptop, which seems far more future-proof than the dual-core 725, and if OCing does not work on the quad, a 2GHz quad ain't that bad... -
-
Sorry to hear that, btw, was it a P4 CPU though?
-
And looking at those pictures, looks like Asus went the MXM-IV route with these cards, though it looks kinda pointless because they don't have anything covering or placed on the HE tab. It's also good to know that a 230w PSU can still provide enough power for this thing. -
-
this one is an old one but i will re-run them again once back in the uk
-
The W90 is supposed to weigh more than 6kg, which puts it in the same weight class as the D901C, or heavier. -
dondadah88 Notebook Nobel Laureate
ok here's my farcry 2, i think the settings are the same. i had stutter issues for some reason. i used e-wrecked's modded 8.12 driver. crossfire helped in this one but i don't know about a single card
oh i got to go to work, i'll be back tonight at 10. if anyone knows why i had stutters just post. i'll read. Lol. -
uninstalled it, reinstalled it, still the same problem, so i got rid of it. don't know why but it doesn't like my lappy -
you're good ichime! i tried to zoom in to see what type of cards those were and i couldn't tell.
230 probably can be pushed to 270, but with the cpu coming in at 45 watts, plus another 10 or so watts from overclocking, and the vga cards at even 50 to 75 watts, there is still plenty of room for overclocking. the d901c's downfall is using a desktop quad cpu
starting at 95 watts and going up to 130 to 150 watts. no room for overclocking -
-
yes, i would hope nvidia got wise with that rebrand thing when they clearly got busted with the 8800m gtx/9800m gt. let's hope they don't try that one yet again.
but if they do in fact have the fsb working and an overclockable quad...then things will start to even out yet again.... hopefully clevo is watching this and will play accordingly. the 9800m gtx's should pull in at:
quads
2.0: 7600-7800
3.0: 9000-9600
cpu: 5200-6000
right now
2.0: 6000-6600
3.0: 7400-8100
cpu: 4200-4800
and the g280 being higher...if it's not some rebrand nonsense
this is why i mentioned that we still have yet to see the 9800m gtx's full potential, whereas right out of the gate we are seeing the 4870's full potential. it has the right system to make it shine.
and that guy that did the overclocking is a real overclocker, so we may see it hit 21,000 in a month...who knows. -
But let's say there was a W90 system with an SLI chipset instead. I'm positive that, coupled with overclocked GTs or even GTXs and the FSB pumping, it would net 20k easily as well. -
I just received a mail that confirmed the availability of the msi gt725 in the US market on feb 23.
-
That falls in line exactly with what Justin said.
-
And if it did break 20k, it would only be with a 4GHz i7, which doesn't count anyway because SM 3.0 subscores are the only thing that matters in 3DMark06.
It doesn't matter if the 9800M GTX ever reaches desktop speeds; you can't recover a 9% loss in shaders compared to the desktop version. The 48xx Mobility cards have all 800 shaders intact, so if you match clocks, you match desktop performance. You can't do that with NVidia.
I'm not down on NVidia as a company, I just think it's shoddy to cripple your mobile cards. Both companies lower clocks for thermal reasons, but NVidia's 9800M GTX is cut down on top of the lower clocks, while ATI's 4870 and 4850 keep the same card specs, just at lower clocks. -
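The clock-matching argument above can be sketched with some back-of-the-envelope arithmetic. This is only an illustration: raw shader throughput is approximated as shader count × shader clock, and the shader counts and clock below are illustrative assumptions, not measured specs.

```python
# Rough sketch of the clock-matching argument: a mobile part with fewer
# shaders cannot close the gap even at matched clocks, while a part with
# all shaders intact matches desktop throughput once clocks match.
def shader_throughput(shaders, clock_mhz):
    """Relative shader throughput (arbitrary units)."""
    return shaders * clock_mhz

# Hypothetical numbers for a cut-down mobile card vs its desktop version.
desktop = shader_throughput(128, 1688)      # assumed desktop config
mobile_cut = shader_throughput(112, 1688)   # same clock, fewer shaders
print(mobile_cut / desktop)  # 0.875 -> a deficit clocks alone can't recover

# A mobile card with all shaders intact (e.g. 800 of 800) at matched clocks:
print(shader_throughput(800, 750) / shader_throughput(800, 750))  # 1.0
```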
side note:
and all my overclocks are well under everyone else's, but yet all my scores are higher...it's not always how high you overclock, but what you know. and cpu overclocking is far more fruitful than gpu overclocking. if i had that...i would be well into the 18 or 19k range with this system alone. -
3DMark06 score discussions are really fruitless though, because the overweighted CPU component means scores don't translate into gaming performance.
Johnksss's 3DMark06 SM 3.0 subscore is 85% of what my stock desktop 4870 subscore is, at same CPU MHz. However, his Crysis benchmark is only 55% of what my stock 4870 score is, at same CPU MHz.
The point I'm trying to make and have been trying all along is not to say that the 9800M GTX is a bad card; it's a good card, and johnksss has a good system. I'm trying to make the point that ATI's 48xx cards are actually what they're named and they game like they should. Synthetic benchmarks do not do them justice at all. -
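The synthetic-vs-game gap described above is just a pair of ratios. A minimal sketch, where the 85% and 55% figures come from the post but the absolute scores are made-up placeholders chosen only to reproduce those ratios:

```python
# Sketch: the same mobile card looks much closer to the desktop part in a
# synthetic benchmark than in an actual game. Absolute values below are
# hypothetical; only the ratios reflect the comparison in the post.
desktop_sm3, mobile_sm3 = 10000, 8500        # hypothetical SM 3.0 subscores
desktop_crysis, mobile_crysis = 40.0, 22.0   # hypothetical Crysis FPS

print(mobile_sm3 / desktop_sm3)        # 0.85 -> gap in the synthetic test
print(mobile_crysis / desktop_crysis)  # 0.55 -> much larger gap in-game
```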
no one has gamed with the 4870m's yet...still waiting on that. also waiting on them to run vantage, since vantage doesn't rely on your cpu, but on what your cards can do on their own....
now my gtx280 performs pretty well in gaming and benchmarking.
looking at your overclocked q6600...drop it back down to 2.4 and then run those tests again. matter of fact...i have the same cpu you have and i did a crysis test with it. old crysis, before the patch.
let me know how far behind you i am...im curious now..
and no i7 needed to break 20k, just an overclocking laptop like the asus to do it....it's all about the numbers and how well they are routed...short version of the story.. -
Red_Dragon Notebook Nobel Laureate
It's amazing how close to desktop GPUs these cards are, great job
Now hopefully the G280 won't cost an arm and a leg (it probably will) -
-
Red_Dragon Notebook Nobel Laureate
How long does it take for Crysis to crash? Is it right away or does it take time?
-
side note...when comparing this percentages stuff...one has to realize that in the gpu card game.....5 to 20 percent is a major gain in performance whether you realize it or not. we are trying to gauge it by normal standards and that isn't how gpus compare in reality....
10 percent can mean 3 more frames at very high crysis, which could mean that you now get 33 frames instead of the other guy getting 30, and that slight increase can make or break one's gaming experience while online.
the only 100 percent increases one will see are from adding 2 or 3 cards to a system... now that more and more people are getting into mobile gaming...in the next few years they will start listening to people like us telling them to quit bogging the cards down!!...lol by that time....they will be almost direct compares.....(speculation of course) -
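The frame-rate arithmetic above is simple but easy to get backwards, so here is a minimal sketch using the 30 vs 33 FPS example from the post:

```python
# Percentage gain of one frame rate over a baseline frame rate.
def percent_gain(base_fps, new_fps):
    return (new_fps - base_fps) / base_fps * 100

# 3 extra frames on a 30 FPS base is a 10% gain, not 5%.
print(percent_gain(30, 33))  # 10.0
```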
dondadah88 Notebook Nobel Laureate
i have crysis patched with 1.2, what settings do you want it at?
-
1680x1050, all High, no AA/AF.
That's one setup almost all gaming notebooks can run so it's useful for us, and desktops too.
Regarding the crashes, my screen goes black and it never opens. I might need to revert to Catalyst 8.12. -
just not sure if i really want to spend a few extra hundred to find out right now.
-
dondadah88 Notebook Nobel Laureate
ok i'm running the test now all on high.
-
dondadah88 Notebook Nobel Laureate
NEXT BENCH RUN- 2/20/2009 12:14:58 AM - Vista 64
Beginning Run #1 on Map-island, Demo-benchmark_gpu
DX10 1680x1050, AA=No AA, Vsync=Disabled, 64 bit test, FullScreen
Demo Loops=3, Time Of Day= 9
Global Game Quality: High
==============================================================
TimeDemo Play Started , (Total Frames: 2000, Recorded Time: 111.86s)
!TimeDemo Run 0 Finished.
Play Time: 98.36s, Average FPS: 20.33
Min FPS: 13.30 at frame 1956, Max FPS: 32.70 at frame 992
Average Tri/Sec: -14584935, Tri/Frame: -717304
Recorded/Played Tris ratio: -1.28
!TimeDemo Run 1 Finished.
Play Time: 83.80s, Average FPS: 23.87
Min FPS: 13.27 at frame 1965, Max FPS: 36.86 at frame 82
Average Tri/Sec: -17096436, Tri/Frame: -716326
Recorded/Played Tris ratio: -1.28
!TimeDemo Run 2 Finished.
Play Time: 82.99s, Average FPS: 24.10
Min FPS: 13.27 at frame 1965, Max FPS: 36.86 at frame 82
Average Tri/Sec: -17280000, Tri/Frame: -717025
Recorded/Played Tris ratio: -1.28
TimeDemo Play Ended, (3 Runs Performed)
==============================================================
Completed All Tests
edit: i edited it about 5 times, sorry, but it had all the benchmarks i ever did on crysis. it's final now, i'll do the 1200 now. -
I got Crysis working on Windows 7 but from a bit of Googling it appears it suffers a bad performance hit in DX10 due to the nature of the beta. I'm having some wonky driver issues in general.
Windows 7 - DX10 1680x1050 AA=No AA, 32 bit test, Quality: High ~~ Overall Average FPS: 39.965
Windows 7 - DX9 1680x1050 AA=No AA, 32 bit test, Quality: High ~~ Overall Average FPS: 43.355
XP Pro - DX9 1680x1050 AA=No AA, 32 bit test, Quality: High ~~ Overall Average FPS: 47.065
I'm chalking the first two up to drivers and Windows 7 flukes. I was able to repeat my XP DX9 score on multiple occasions. -
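The three averages above can be expressed as a percentage slowdown relative to the XP DX9 baseline, which makes the DX10 hit easier to see. A small sketch using the numbers from the post:

```python
# Compare each run's average FPS against the fastest (XP DX9) baseline.
results = {
    "Win7 DX10": 39.965,
    "Win7 DX9": 43.355,
    "XP DX9": 47.065,  # baseline
}
baseline = results["XP DX9"]
for name, fps in results.items():
    slowdown = (baseline - fps) / baseline * 100
    print(f"{name}: {slowdown:.1f}% below XP DX9")
```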
On Xoticpc it says the laptop has a S/PDIF output, does that mean it uses mini spdif out of one of the ports?
-
[email protected] Company Representative
http://en.wikipedia.org/wiki/TOSLINK
They usually are included with the optical digital cable -
you sure your crossfire is working?
-->*OFFICIAL: MSI GT725 Owner's Lounge*<--
Discussion in 'MSI' started by faisalhero, Feb 4, 2009.