In this article, MXM-Upgrade's Kris Verbeeck benchmarks different graphics cards in the same notebook to give an apples-to-apples comparison of several MXM-compatible notebook graphics cards.
When buying a new graphics card or considering the purchase of a sparkling new gaming system, be it desktop or notebook, the prudent buyer often resorts to checking benchmarks rather than trusting the enthusiasm of fanboys or marketing slogans. For desktop parts, this is a fairly straightforward operation: there are numerous sources of benchmark information, often from respected outlets such as Tom's Hardware, Anandtech, etc.
Benchmarks for notebook components are also available in abundance, from sources such as here at NotebookReview.com, NotebookForums.com, LaptopVideo2go.com and many, many more. Unfortunately, they often share the same fundamental flaw: Graphics Processing Units (GPUs) often cannot be swapped, or if they can, no compatible GPU is available. Two GPUs are therefore usually compared in entirely different systems. Some differences can be taken into account, albeit with a high degree of inaccuracy and uncertainty: one can add or subtract a bit for the system with the slower RAM, the different CPU architecture, the slower hard drive, and so on. The influence of these parts is often limited compared to that of the GPU, but it is there. Unfortunately, the problems don't end there. A simple driver difference can influence the end results. Perhaps one person's system is running a hungry process in the background; maybe his hard drive is running out of breath. The number of possible party poopers is endless, and they may all result in a final judgment or choice that is off the mark and doesn't reflect true standalone GPU performance.
However, as the primary and only source of MXM cards on the net, I find myself in an unprecedented and privileged situation: I have access to a wide range of MXM cards, and I have a notebook that can handle them all. The plan to compile a set of GPU benchmarks unparalleled in consistency, reliability and representativeness formed in my mind some time ago. At the time I had an Nvidia 6200 32MB/TC, a 6600 256MB and a 6800 256MB at my disposal, which I judged too limited a selection to kick off the project. When MSI Europe offered me the chance to try an Nvidia Go 7600 256MB Type II card, I knew it was time. The card will be offered in their upcoming MSI L735 and L745 series of notebooks, aka the M1042 and M1039. I am happy to report that the card runs very stably and performs very well, even though the card I received appears to be an engineering sample, as evidenced by the picture just below.
I also briefly benchmarked the ATI X1800 some months ago. I have added those results, but please consider them less reliable, as less time was available. Furthermore, the set of benchmarks is not complete for this GPU.
You'll note that all benchmarks come with placeholders for the 6600 and X1800 cards. These are scheduled to pass through my benchmarks in the next few weeks, and the results will be added to this article as soon as they are available.
Test System
Fujitsu-Siemens AmiloM 3438
- Hard Drive: Samsung HDM060II (60GB)
- Processor: Intel Pentium M 1.73GHz (2MB L2)
- Memory: 2 x 256MB DDR (533MHz)
- Screen: 1440 x 900 screen resolution
- Driver: 84.71
As can be seen, the memory is obviously a weak part in this system. As the Farcry gaming benchmark required more than the available 512MB, I assume some performance was lost there. This is a previous-generation system, and hardly the fastest in that category. I leave it up to the specialists to argue about it, but the advantage modern dual cores have over this aging Pentium M is obvious. In the end, though, I was looking for an apples-to-apples comparison, and that's exactly what you'll get here.
I used the driver of a newer FSC model, the Pi1547, as I know the model is very close to mine. It is not the latest driver, but a stable and good one that supports all the GPUs benched in this article after the .inf file was replaced with the corresponding modded .inf from my good friends over at LaptopVideo2go.
A more in-depth look at my notebook can be found here.
Tests
- 3DMark01
- 3DMark03
- 3DMark05
- 3DMark06
- Aquamark3
- Farcry
- Gunmetal
- X2
The entire range of available 3DMark test suites has been executed (01, 03, 05 and 06). Aquamark3 was the final synthetic benchmark. I added three "real life" gaming tests: Farcry, X2 and Gunmetal. The Farcry benchmark is based on the 'real' game; the rest are free benchmark demos that can be found on the Guru3D website.
I ran all 3DMark suites twice to make sure no large discrepancies exist between runs. This notebook 'features' the dreaded lagging issue, so when the effects of this issue were suspected, the benchmark run was omitted from the results and run again. The Farcry benchmark was programmed to run 5 consecutive times. This has some importance as I noted the first run was always a bit slower. I assume this has to do with parts of the level being loaded during the first run and being available during the following runs. The picture below fueled that assumption.
Note the hard drive LED being lit -- during the first run, a lot of disk access occurred. During the following runs it all but vanished, since the data had been loaded into memory. Furthermore, I noted that when running the benchmark as a whole twice, the second run was much faster (>10%). For this reason, I ran all the Farcry benchmarks twice and kept the second run. The X2 benchmark was run only once but consists of multiple levels. I ran the Gunmetal benchmark only once because, well, I was getting bored and tired of running benchmarks! All benchmark results are available for download.
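As a side note, the warm-up effect described above is easy to handle mechanically. Below is a minimal sketch (not the actual tooling used for this article) that discards a disk-bound warm-up pass and averages the remaining runs; the fps figures in the example are made up for illustration:

```python
def settled_average(runs, warmup=1):
    """Average benchmark passes after discarding warm-up passes.

    The first pass is typically slower because level data is still
    being read from disk; later passes are served from the file cache.
    """
    if len(runs) <= warmup:
        raise ValueError("need more runs than warm-up passes")
    settled = runs[warmup:]
    return sum(settled) / len(settled)

# Hypothetical five-pass Farcry run where the first pass is disk-bound:
fps = [41.2, 52.6, 52.9, 52.7, 52.8]
print(round(settled_average(fps), 2))  # averages the last four passes
```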
I decided against adding thermal readings, as the thermal setup of the cards has not been equal. I ran all cards with the original Fujitsu-Siemens heatpad, which isn't exactly a performance stud to begin with, and it degraded after being applied and reapplied several times. On top of that, I was forced to run most cards without heatpads between the heatsink and the memory BGAs. While this hasn't affected performance as far as I know, it certainly has an influence on thermal performance. To make sure no thermal throttling occurred, I ran 3DMark06 with Rivatuner in the background and checked the logs after each run.
To make up for that, I added some power readings for each card. The first is the ampere reading on an idle desktop; the second is while running the 3DMark06 benchmark. I removed the battery for this test because the battery charger takes some power as well. The displayed or mentioned reading is what I noted as the 'average maximum reading'. Not exactly a scientific approach, so take these measurements as an indicator rather than a final judgment. To get an idea of power consumption, multiply the reading by 20. As voltage tends to fluctuate under different load conditions, this is once again not scientific, but an indication.
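That conversion is just Watt's law (P = V x I) using the notebook's nominal 20V input. A trivial sketch, assuming the 20V figure holds under load:

```python
NOMINAL_VOLTAGE = 20.0  # the notebook's nominal input voltage, per the article

def approx_watts(amps):
    """Rough power draw: ampere reading times the nominal 20V input."""
    return amps * NOMINAL_VOLTAGE

# e.g. a 2.48A reading under load works out to roughly 49.6W
print(approx_watts(2.48))
```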
Nvidia Go 6200
Find anything missing?
Yeah, this card sports a whopping 16MB of DDR. One single chip. One. The technology that 'enables' this is called 'TurboCache' or 'TC'. The idea is anything but new, but it regained some popularity after the introduction of PCI Express. While AGP offers less bandwidth from the GPU to the CPU than the other way around, PCI Express allows the same transfer speed in both directions. This makes the idea of sharing system memory with the GPU slightly less insane, but the fact remains that you're not only looking at less performance for the GPU, you're crippling your CPU's performance while you're at it. While this may work reasonably well for expanding 128MB of DDR in the rare cases where more memory is required, it simply stinks for a card with only 16MB of dedicated RAM. As the benchmarks will show.
3DMark01: First run 6023. Second run 6004.
Average: 6013
3DMark03: First run 1787. Second run 1791.
Average: 1789
3DMark05: First run 881. Second run 881.
Average: 881
3DMark06: First run 238. Second run 238.
Average: 238
Aquamark3: First run 17613. Second run 17595.
Average: 17604
Farcry: 10.29 fps
X2: 3.211 fps
Gunmetal: Benchmark1 average: 4.35 fps Benchmark2 average: 4.49 fps
Benchmarking a slow card is never much fun. Benchmarking a very slow card even less so. That said, it offers some premium benefits not found with other cards: it allows you to manually verify frame counts (slowly count the frames), and you can enjoy the craftsmanship the designers have put into adding detail and effects to the game. Needless to say, after hours and hours of benching, these benefits start to pale. Imagine my relief when the final Gunmetal benchmark was run. That benchmark was a very long one, and the 6200 logged another abysmal performance.
Imagine my horror when I noticed the Gunmetal benchmark was run at 852 x 480 and had to be done all over again. Anyways, I digress.
The Rivatuner report can be found here. Highlights:
- 300MHz core
- 300MHz (600 effective) memory
- 32-bit memory bus
- 16MB DDR
- 4pp, 3vp
With the 6200, the Amilo consumes about 1.39 amps idling on the desktop. While running 3DMark06, the 'highest average' was 2.48 amps.
Nvidia Go 6800
I shipped my own Amilo 6800 to a customer to make his life a bit easier, so I now have one of my own 'Amilo Special' 6800s fitted. It works like a charm. This is the 'original' configuration of this notebook, what it was designed for. One problem with the cards I sell is that they lack the specific mounting provisions for the heatsinks of various notebooks. The second picture below shows an easy, albeit hardly beautiful, solution. I used an XL screw for the picture. The screw goes through the heatsink and is bolted down. This procedure works just fine but does bring the risk of warping the card if too much torque is applied. To the left you'll also notice the two metallic blocks (coils) that protrude from the surface. These are the components that commonly require the heatsink to be Dremeled, a small yet annoying task.
3DMark01: First run 18978. Second run 18777.
Average: 18878
3DMark03: First run 9447. Second run 9412.
Average: 9429
3DMark05: First run 4103. Second run 4094.
Average: 4098
3DMark06: First run 2102. Second run 2101.
Average: 2101
Aquamark3: First run 56866. Second run 56700.
Average: 56783
Farcry: 52.77 fps
X2: 52.866 fps
Gunmetal: Benchmark1 average: 40.45 fps. Benchmark2 average: 45.49 fps
The Rivatuner report can be found here. Highlights:
- 375MHz core
- 300MHz (600 effective) memory
- 256-bit memory bus
- 256MB DDR
- 12pp, 5vp
The 6800 logs 3.84 amps on 3DMark06 (3.78 pictured). It is interesting to see how limited the influence of the GPU on idle consumption is. Or rather, the difference drops below the level of significance these rudimentary tests can offer. I guess this warrants a 'job well done' for the Nvidia engineers.
Nvidia Go 7600
As said, we owe a special "thank you" to the folks at MSI for sending us an advance sample of their MXM lineup. As MXM-Upgrade reported earlier this year, MSI has plans with MXM. So when I found the card in the mail, I was anxious to find out exactly how far they had advanced. When I opened the package, I was a bit puzzled by the fan on the GPU. When I compared the picture to some I took earlier this year at CeBIT, things cleared up a bit. While this particular fan can be found on some chipsets on MSI motherboards, it was also on the MXM cards that I saw on the Geminium technology demonstrator. For the time being, please don't ask if and when we'll be able to offer these cards. We simply don't know yet.
It can be assumed this particular card served some time, or was scheduled to, on the Geminium platform. While the actual commercial purpose of the Geminium is not exactly clear, I think it is a fair guess that it serves at least as a lab tool to debug MXM cards for notebooks. Indeed, it allows notebook designers to separate the design of a notebook from that of its graphics subsystem, which allows the development of one to advance even if the other is bogged down.
We removed the fan with some difficulty, as it was attached with pretty strong double-sided thermal tape. After clearing away the residue, the GPU was revealed. A first sign that I had a good performer in my hands was the memory chips: ready to be clocked at 500MHz (1,000MHz effective). As with most 'bare' MXM cards, including mine, this card did not come with mounting provisions, so I added a simple screw and bolt. It should be noted that this approach works perfectly well, but as the heatsink is not 'stopped' by any support, there's a risk of bolting it down too firmly and warping the MXM card. Common sense is a robust solution for this. And then... benchmark time!
3DMark01: First run 19190. Second run 19016.
Average: 19103
3DMark03: First run 9281. Second run 9255.
Average: 9268
3DMark05: First run 4120. Second run 4019.
Average: 4070
3DMark06: First run 2235. Second run 2231.
Average: 2233
Aquamark3: First run 58533. Second run 58481.
Average: 58507
Farcry: 57.15 fps
Never mind the fact that the bench says I had a 6800 installed. I can assure you it was the 7600.
X2: 49.413 fps
Gunmetal: Benchmark1 average: 40.55 fps Benchmark2 average: 45.52 fps.
The Rivatuner report can be found here. Highlights:
- 445.5MHz core
- 500MHz (1000 effective) memory
- 128-bit memory bus
- 256MB DDR3
- 8pp, 5vp
With the 7600, the Amilo consumes about 1.45 amps idling on the desktop. While running 3DMark06, the 'highest average' was 3.2 amps (3.15 pictured).
The palmrest area does indeed seem to heat up less.
Nvidia Go 7900GS 512MB
Needless to say we are very glad to be able to offer these cards. There's a waiting list and supply is erratic at best, but at least the show is running!
This one is once again based on the 'Arima Special' form factor, meaning it will cause additional problems for some and offer a unique opportunity for others. Mounting provisions are again lacking, and as often seen with other cards, some of the components on the card (coils, to be precise) may force you to mill your heatsink with a Dremel or equivalent tool.
Please note the aluminium heatsinks+heatpads on the bottom memory banks.
A more thorough review will shortly be published on MXM-Upgrade.com. I would like to add at this point, however, that as the fan did not respond to the rising temperature of the card, I forced the fan on using the ACPI functions of Notebook Hardware Control. As such, NHC was always running in the background.
Well, there's little more to say. On to the benchmarks!
3DMark01: First run 21089. Second run 21023.
Average: 21056
3DMark03: First run 12610. Second run 12708.
Average: 12659
3DMark05: First run 5807. Second run 5809.
Average: 5808
3DMark06: First run 3083. Second run 3061.
Average: 3072
Aquamark3: First run 65028. Second run 64904.
Average: 64966
Farcry: 57.15 fps
X2: 68.317 fps
Gunmetal: Benchmark1 average: 46.52 fps. Benchmark2 average: 48.31 fps.
Average: 47.42 fps
The Rivatuner report can be found here. Highlights:
- 350MHz core
- 500MHz (1000 effective) memory
- 256-bit memory bus
- 512MB DDR3
- 20pp, 7vp
The fairly low core clock surprised me a bit, especially compared to the 445.5MHz the 7600 was running at. Another nice touch is the automatic clock throttling the 7900GS features...
With the 7900GS, the Amilo consumes about 1.80 amps idling on the desktop. While running 3DMark06, the 'highest average' was 3.76 amps (3.71 pictured). The reason for throttling clocks while the GPU is idle becomes quite clear: it consumes enough as it is, which is not surprising. Power under load is about the same as a vanilla 6800, which is nice for people considering the upgrade.
Results Graph Comparison
The 6800 and 7600 are roughly on par, with the ATI X1800 trumping them where measurements were available. In fact, the 6800 and 7600 numbers differ so little that the gap should be considered non-significant. The 6200 is not capable of putting up a fight, a fact that is increasingly highlighted in the more recent synthetic benchmarks.
The Nvidia 7900GS smokes them all, which is hardly surprising. A mild eyebrow-raiser was the relatively small difference in 3DMark01. The same can be seen in the Aquamark benchmark. Both tests are ageing, and perhaps the CPU was holding them back a bit.
The synthetic benchmarks are mirrored in the game benches, with the 7600 and 6800 trading first place and the 6200 not even in the picture.
The Farcry benchmark of the 7900GS is somewhat of a mystery. I assumed this would be the benchmark where the card would shine, but instead it logged a rather poor performance. I do not have an answer for this one, but it might have something to do with the system running into the CPU ceiling, brick-house style. Another possibility is that Farcry relies more on the vertex processors than the other tests do. Basically, the 7600 runs 5 vertex processors at 445.5MHz, the 7900GS 7 at 350MHz. Simply multiplying gives a maximum performance gain of about 10% if this were the bottleneck of the system. Still considerably better than the 2.5% the system logged.
This was both a surprise and a disappointment.
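For what it's worth, the vertex-throughput estimate above can be written out explicitly. This little sketch multiplies vertex processor count by core clock for each card, using the figures from the Rivatuner highlights:

```python
# Vertex processor count x core clock (MHz), from the Rivatuner readouts
vp_7600, clk_7600 = 5, 445.5
vp_7900gs, clk_7900gs = 7, 350.0

throughput_7600 = vp_7600 * clk_7600        # 2227.5 "vertex MHz"
throughput_7900gs = vp_7900gs * clk_7900gs  # 2450.0 "vertex MHz"

gain = throughput_7900gs / throughput_7600 - 1
print(f"{gain:.1%}")  # theoretical ceiling if vertex-bound: about 10%
```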
It came as a bit of a surprise that the idle measurements differed so little, except for the 7900GS. These numbers would seem to suggest that normal web browsing, text editing, watching movies, etc. will not affect battery life that much. We look forward to logging more measurements to check this out. The measurements under load only confirm what one might assume, with a difference of about 27 watts between the 6200 and 6800. The 7900GS fits nicely within the thermal envelope set by the 6800.
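That delta follows directly from the load readings quoted earlier for the two cards; a quick check, using the 20V nominal voltage assumed throughout this article:

```python
VOLTAGE = 20.0    # nominal input voltage assumed in this article
load_6200 = 2.48  # amps under 3DMark06, from the 6200 section
load_6800 = 3.84  # amps under 3DMark06, from the 6800 section

delta_watts = (load_6800 - load_6200) * VOLTAGE
print(round(delta_watts, 1))  # roughly 27W between the two cards under load
```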
-
Superb article!
-
Charles P. Jefferies (Lead Moderator)
Extremely thorough and informative article Kris. I found the MSI sample MXM card to be especially interesting - I wasn't aware that specific vendors would be producing them.
The Go6200 is sad indeed - 16MB of dedicated RAM and a 32-bit bus. The Go7900GS 512MB is a killer though, and at least it has a reasonable power consumption. -
Sooo, exactly how many hours or days did you spend compiling all these benchmarks and getting everything setup? This is an amazing amount of work you did to gather all the data and give a visual display of how graphics cards have progressed. Well done.
-
When you talked about MSI's "MXM lineup", did you mean to say that they will be producing retail cards? This would be incredible, and mean that to get, say, a 7600 I wouldn't have to pay ungodly amounts to have one shipped from Europe. I have been watching carefully for this opportunity, so it will be welcome. Great article, though!
-
That said, shipping from Europe is quite OK. I currently pay about the same for the US as for Europe. As I am transforming MXM Upgrade from a private project into a company, that might change to cover the export procedure costs. But if it turns into a company, I can also drop the 20% sales tax for US customers, so that should level things a bit. -
Well, I have a question. While you stated the 6200-->6800 power consumption was a 24W (or was it 26?) difference, you never stated the 6200's base usage in the first place, nor the 7600's, which would make an interesting comparison, and indeed not the 7900GS's either, although I assume it would be similar to the 6800 if it fits the same thermal envelope...
Of course, I could calculate most of this myself if I knew the voltage they draw, given that you provided the amperage... >.> cursed Watt's law! -
Kris - that is what I was searching for. Excellent article - goes right into my favorites. I might even order some card from you.
Cheers,
Ivan -
So, in short, it's an indication of GPU power consumption, not a number written in stone.
That said, as the article states, the nominal voltage is 20V. Again, voltage may drop a bit under heavy load and that has not been taken into account.
Seriously, though, your notebook can probably not take more than a Type II card, so you'll have to wait until I can get my grabby hands on some 7600s before I can offer you anything. -
Yep - Type II, but now that I am looking at your pics of what are undoubtedly Type III cards - they look like they would fit too. And my dealer offers only 6600 128MB, X700 128MB and 6600 256MB cards - so I guess they are all Type II and that is the biggest format that would fit.
7600 sounds nice, and I hope the price will be too.
Cheers,
Ivan
P.S. Below is a pic of the 6600 in my dealer's catalogue - I think it is Type II.
-
The 7900GS is 'Arima Special': about as high as a Type II but 13mm wider (7mm to the left and 5mm to the right).
The 7600 is 'regular' Type II. -
If the position of the memory chips is the same or 5 mm wider, it would fit under my heatsink. I checked, and the heatsink can cover more space than my x700 is currently using. Once I decide to upgrade I will actually measure the positions.
Thanks.
Ivan -
7900GS... Drool...
On the other hand, would it be feasible in a notebook such as mine? Would the heat and/or power consumption be a problem? If not, and if it would fit (I have yet to measure), I ought to start saving...
On the other hand I am quite pleased with my 6600 so a 7600 would be plenty I'm sure (actually if I had the 7900 my processor would probably be a bottleneck anyway).
PS: Would it be feasible to get my 6600 onto the market somehow, perhaps through you? I know it may not be possible as it is used, but it would be nice to gain some cash back. -
There's currently a wee bit of discussion regarding your notebook, which is the same as Ikovac's, but a 7600 should slip in no problem.
-
Thanks I'll be keeping an eye out for any news.
-
Enlightening !
-
Got some info that may help to explain why the 7900GS craps out on Farcry.
It would seem the engine only turns on some eye candy (additional environment elements etc.) when the graphics chip is up to it. That would explain a lot. I'll try to check it out, but my 6200 just spent some time in an iMac, so that might take a while. -
Ok. Finished benching the 6600 and X1800. I'll be adding results this weekend. That will bring the list of GPUs to 6200, 6600, 6800, 7600, 7900GS and X1800.
Ice-Tea -
I added some benchmarks. Now included: 6200, 6600, 6800, 7600, 7900GS and X1800. I only changed the graphs; I did not add or change text, nor did I add the in-depth sections for the 6600 or X1800. A bit too much work. For the latest, up-to-date version, please check MXM Upgrade.
During the last test I did with my 3438, it died on me. This is why there are no power measurements available for the 6600. This is obviously also the reason why this article won't be updated any more in the future.
An Apples-to-Apples Comparison of MXM Notebook Graphics Cards
Discussion in 'Notebook News and Reviews' started by Ice-Tea, Oct 19, 2006.