by Dustin Sklavos
And so it begins ... after the long hiatus, the Notebook Review Mobile GPU Guide's 2009 Edition is rushed out of the gates before Nvidia can make matters more complicated with more GPUs. I have to be honest: part of the reason you didn't see this article sooner is that, simply put, Nvidia created “brand spaghetti” in the marketplace. Despite their mobile parts being based on only a couple of different chips, they have a full twenty-five (25!) parts in circulation right now, and that's not including the recently announced G200 line, which I'll talk about briefly towards the end. ATI's not doing much better with twenty variants in circulation, but their parts are far easier to keep track of and much easier to describe.
Because of the radical changes to the graphics market since my last guide, I have to revamp my approach to notebook Graphics Processing Units (GPUs). Pipelines and all that garbage are past tense now, with almost all mobile parts using the unified shader model that DirectX 10 brought along with Windows Vista. And because of the product flood in the notebook market, I'm going to take a different approach and organize mobile graphics cards by the chips that power them. When you get to the Nvidia sections in particular, you'll see exactly how practical this approach is.
Finally, before you get into this guide you may want to have a look at my “How it Works” entry on mobile graphics.
NO YOU CAN'T UPGRADE YOUR NOTEBOOK GRAPHICS
You cannot. Stop posting in the forums. Stop asking about this. You just can't. Moving on.
UNIFIED SHADERS
Windows Vista brought with it DirectX 10, and with DirectX 10 came a completely new approach to handling shaders. Gone are the distinct pixel and vertex shaders, replaced by unified shader technology that's much more flexible.
With each GPU I'll be noting the number of unified shaders in that part, but I want to make clear that Nvidia and ATI use completely different approaches to their shader designs. For example, Nvidia's top-end part on the desktop has 240 unified shaders, while ATI's has a staggering 800. If you look at the raw numbers, the ATI part should be monumentally faster, but the designs are radically different: ATI counts every lane of its five-wide shader units individually and runs them at the core clock, while Nvidia uses simpler scalar shaders running at well over twice the core clock. As a result, Nvidia's top end actually outperforms ATI's. Thus, shader counts should only be used to compare same-branded parts, never ATI vs. Nvidia.
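To put rough numbers on that, here's a back-of-the-envelope peak throughput calculation (my own sketch, using the published specs of the mid-2009 desktop flagships; real game performance depends on how well those shaders can be kept busy, which is exactly where ATI's five-wide design loses ground):

```python
# Peak programmable shader throughput in GFLOPS:
# shaders x shader clock x floating-point ops each shader can issue per clock.
def peak_gflops(shaders, shader_clock_mhz, flops_per_clock):
    return shaders * shader_clock_mhz * flops_per_clock / 1000.0

print(peak_gflops(240, 1476, 3))  # GeForce GTX 285: ~1063 GFLOPS (MAD + MUL)
print(peak_gflops(800, 750, 2))   # Radeon HD 4870:  ~1200 GFLOPS (MAD)
```

Notice that even on paper the gap is nowhere near what 800 vs. 240 suggests, and in practice the Nvidia part pulls ahead anyway.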
More is, of course, better, but more shaders will also draw more power and throw off more heat.
MEMORY BUS WIDTH AND TYPE
One thing that hasn't really changed much in the interim is memory bus technology. In general, you will see three different memory bus widths on mobile parts: 64-bit, 128-bit, and 256-bit. Parts worth gaming on will generally never have a 64-bit memory bus, which is the narrowest and slowest. A 256-bit bus, on the other hand, is much more expensive to produce and so will only appear on absolute top-end cards. The happy medium is often a 128-bit bus.
There are also four types of memory in circulation for mobile graphics. The first three differ generally in the top speed they can run at, while the fourth is newer and very different from its predecessors.
These first three are, in ascending order of performance, DDR2, DDR3, and GDDR3. Many manufacturers will mix up “DDR3” and “GDDR3,” and for the most part that's okay, as they have pretty similar performance characteristics. DDR2 is the slowest by a mile and on most parts is going to be the second biggest performance bottleneck, next to the memory bus width. If you're going to be gaming, you'll really want to avoid DDR2 if possible.
The fourth and still fairly rare memory technology is GDDR5. GDDR5 runs at a quadruple data rate instead of the double data rate of the other memory technologies, and can produce mountains of bandwidth. Using GDDR5 effectively works like a jump in memory bus width: GDDR5 on a 128-bit bus can produce memory bandwidth comparable to a 256-bit bus, and on a 256-bit bus can produce staggering bandwidth comparable to a 512-bit bus! As someone who actually has a desktop card using GDDR5, I can say it works pretty much as advertised; when tweaking the clock speeds on my graphics card, the memory speed is almost never the bottleneck.
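If you want to see how bus width and memory type interact, peak memory bandwidth is simple arithmetic: bus width in bytes, times the memory clock, times the number of transfers per clock. A quick sketch (the 800MHz clock is just a round illustrative number, not any specific card's spec):

```python
def bandwidth_gb_s(bus_width_bits, clock_mhz, transfers_per_clock):
    # bytes per transfer x transfers per second, reported in GB/s
    return (bus_width_bits / 8) * (clock_mhz * 1e6) * transfers_per_clock / 1e9

print(bandwidth_gb_s(128, 800, 2))  # 128-bit GDDR3: ~25.6 GB/s
print(bandwidth_gb_s(256, 800, 2))  # 256-bit GDDR3: ~51.2 GB/s
print(bandwidth_gb_s(128, 800, 4))  # 128-bit GDDR5: ~51.2 GB/s
```

That last line is the whole GDDR5 story: the same bandwidth as a bus twice as wide, without the extra traces on the circuit board.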
COMPARABLE DESKTOP PART
Outside of the new G200 lineup that Nvidia has recently announced, mobile GPUs are always cherry-picked desktop GPUs: the exact same silicon with tweaked clock speeds. As a result, each mobile part has a desktop analogue it can be compared to. Since reviews of mobile graphics are so rare (I try to do my share, but it doesn't seem like enough of them ever pass through my hands), it can be helpful to search for the desktop part and at least get a ballpark figure of how the mobile part you're looking at will run.
DIRECTX 10.1 vs. PHYSX/CUDA
One of the big differences between ATI and Nvidia right now is the technologies they're pushing to compete with one another. Up until this point (the point of Nvidia's announced G200 parts), ATI has been the only vendor producing parts compatible with DirectX 10.1 (as introduced in Windows Vista SP1). DirectX 10.1 support has been fairly rare in games, with the most notable introduction so far being Ubisoft's Assassin's Creed. If you're looking to buy that particular game, do not buy the Steam version. Instead, buy a retail copy and do NOT patch it. In a patch, Ubisoft removed the DirectX 10.1 support, claiming it was buggy (it wasn't); with that support in place, ATI cards have a massive performance advantage over the competition. Outside of this instance, DirectX 10.1 hasn't been terribly relevant.
But then again, neither has PhysX. On-chip PhysX is only usable on Nvidia's higher end parts, and can add additional detail to games that support it, like realistic cloth, breaking glass, etc. Unfortunately, like DirectX 10.1, PhysX hasn't proved remarkably compelling either, with the only notable title using PhysX hardware acceleration being the game Mirror's Edge.
Alongside PhysX in the Nvidia corner is CUDA, Nvidia's general-purpose GPU computing platform. CUDA is seeing decent support and may be of interest to videophiles, since GPU-accelerated video encoding can produce healthy performance gains in CUDA-enabled applications. That said, I edit video on my desktop and have yet to see a need for a CUDA-enabled application. More than that, CUDA's shelf life may not be that long with the OpenCL standard beginning to surface. OpenCL is similar to CUDA, except that it's platform-independent. I can't imagine developers playing the vendor lock-in game and using only CUDA when OpenCL (and even Microsoft's upcoming DirectX 11 Compute Shaders) can run on either company's GPUs.
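For the curious, here's what “general-purpose GPU computing” actually looks like. This is my own minimal sketch using the third-party PyCUDA bindings (a Python wrapper around CUDA; nothing in this guide depends on it), and the point is simply that the GPU runs one tiny program across many shaders at once:

```python
import numpy as np
import pycuda.autoinit              # grabs the first CUDA-capable GPU
import pycuda.driver as drv
from pycuda.compiler import SourceModule

# A trivial CUDA kernel: each thread (running on a shader) scales one element.
mod = SourceModule("""
__global__ void scale(float *data, float factor)
{
    int i = threadIdx.x + blockIdx.x * blockDim.x;
    data[i] *= factor;
}
""")
scale = mod.get_function("scale")

data = np.random.randn(1024).astype(np.float32)
expected = data * 2.0
# 4 blocks of 256 threads = 1024 elements processed in parallel
scale(drv.InOut(data), np.float32(2.0), block=(256, 1, 1), grid=(4, 1))
assert np.allclose(data, expected)
```

Nothing magical, but scale that idea up to transcoding a two-hour video and you can see the appeal.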
These are things to be aware of, but they shouldn't affect your decision.
MOBILE DRIVERS
This, on the other hand, probably should impact your decision process. As much as I have a stated preference for ATI's hardware, they're woefully behind when it comes to providing unified mobile drivers. The Nvidia user can update his or her video drivers with new releases (meaning new fixes and performance improvements) just by visiting Nvidia's site and downloading them. ATI users aren't so fortunate; if they want to update their drivers they have to either rely on the notebook manufacturer to provide updates (good luck with that) or use third-party software to modify desktop drivers (a chore).
I don't have too much of a problem doing the latter, but it can be a real headache for the more novice users, and for that reason I would tend toward recommending Nvidia's mobile hardware for the time being until ATI can pick up the slack and make mobile drivers available on the ATI website.
CROSSFIRE AND SLI
Both ATI and Nvidia have multi-GPU solutions for notebooks that will, with two exceptions, only appear in massive desktop replacement units. ATI's is called CrossFire; Nvidia's is called SLI. Please note that these solutions typically don't bring a linear performance improvement; two GeForce GTX 280Ms aren't going to run twice as fast as one, as latency and driver optimization come into play. With this technology, the aforementioned driver situation becomes even more important ... because if a game hasn't been properly profiled by the vendor in question, it won't reap the benefits of SLI or CrossFire.
Now, those two exceptions: ATI and Nvidia both have integrated graphics parts that, when combined with a low-end discrete part, can be used in CrossFire/SLI and thus improve performance substantially. These solutions are still nowhere near as good as mid-range and higher options, but they're also economical and good for battery life. Nvidia's solution with the GeForce 9400M, in particular, can also switch from a mid-range or high-end discrete part to the IGP when the notebook is running on battery, resulting in substantial power savings.
A BRIEF NOTE ON INTEL
Planning to play cutting-edge 3D games? Excellent! Don't buy anything using Intel graphics. Intel's integrated graphics performance is historically poor and rife with compatibility issues. When you're choosing between Intel parts, you're really only dealing with different levels of unplayability.
A BRIEF NOTE ON VIDEO ACCELERATION
In addition to being miserable for gaming, Intel's parts other than the 4500MHD are also the only ones in the lineup (along with ATI's Radeon X1200 integrated graphics series) that don't support high-definition video decoding and acceleration. All other parts are designed to offload high-definition video playback from the main processor to the GPU.
COMPLAINT DEPARTMENT
Finally, I'd just like to thumb my nose at all three graphics vendors (ATI, Nvidia, and Intel) for their complete lack of consumer-oriented business practices. ATI's mobile graphics driver situation is a nightmare, miles behind Nvidia's driver support. ATI's marketing department isn't doing them any favors either: Nvidia routinely works with game developers to make sure games run well on their hardware, and their “The Way It's Meant to be Played” program is everywhere. Whether or not Nvidia pays developers to cripple games (see the Assassin's Creed controversy), ATI's simply not out there hustling.
Intel's driver situation is, if at all possible, substantially worse than ATI's. I'm fairly certain their graphics driver team is either one over-caffeinated teenager in a basement somewhere, or a bunch of programmers that weren't good enough to code for Creative (at least two readers should laugh at this one). Intel has basic compatibility issues with games, and they've made promises about basic performance in their hardware that they have failed to keep. Marketing lies, but Intel's integrated graphics are still essentially broken as far as I'm concerned.
Finally, whoever is responsible for Nvidia's mobile graphics branding needs to suffer at the hands of angry consumers ... or just be fired. It's bad enough that the market is over-saturated with mobile parts that are essentially the same but named differently; worse, the brands of their mobile parts almost never line up with their desktop ones. The most egregious offenders are the GTX 280M and 260M, which are actually just G92 silicon -- in other words, these are not mobile versions of the desktop GTX 280 and 260, which are worlds more powerful.
CONCLUSION OF PART I
At this point we've covered most of the basics in terms of mobile GPUs. In part two, I'll cover the individual hardware and talk about the GPUs, what performance class they should find themselves in, and what your bare minimum for gaming ought to be. Stay tuned.
-
Dustin Sklavos Notebook Deity NBR Reviewer
-
RE: Tigris/880G/785G
amd are going to look like a right bunch of berks if they release a 40 shader chip on 55nm now that nVidia's Ion2 IGP will come with twice the shaders of its predecessor and be fabbed at 40nm.
they owned the IGP market for ages after the 780G was released, seems strange that they are willing to throw that away.......
if the Tigris integrated GPU genuinely was a 4xxx series GPU with 80 shaders then they could have ruled the low-end/light-gaming crowd, especially with the new 45nm PII derived CPU. they baffle me sometimes. -
Could you do a little thing about ATI's and NVidia's CAD GPUs?
-
data isn't exactly widely available for the CAD GPUs and they are usually also horrendously expensive, so it's somewhat reasonable to understand why they might not be covered. just letting you know.
-
How do you rate the 4500MHD if it's not used for gaming, just for the occasional HD movie?
-
-
Hi,
Thanks to your post I realized that a 'mobile graphics' reference in a post's title will ultimately spur my interest, and at the very least make me think twice about resisting the urge to open it. A lesser geeky thing, that is...
Take care,
AdiQue
PS Kudos for Chaz [yes, in advance] for not hating on me! ;] -
I love how you say that "YOU CAN'T UPGRADE YOUR NOTEBOOK GRAPHICS" but you have a picture of an upgradeable MXM graphics card right in your article!
Besides, with more companies buying into the MXM form factor (Acer is now the biggest supporter), that tired old cliché deserves to be retired, or at least modified.
UNLESS YOU ALREADY HAVE A DEDICATED GRAPHICS CARD AND OWN THE RIGHT BRAND OF LAPTOP, NO, YOU CANNOT UPGRADE YOUR GRAPHICS CARD.
That should still shut up the people asking if they can upgrade their Intel GMAs, and the question of "what is the right brand of laptop?" can be directed here.
Otherwise, great article. -
Jerry Jackson Administrator NBR Reviewer
I mean, how many LEGITIMATE stores do you see selling new MXM cards so that you can swap your old MXM card out for a new card? Random ebay sellers don't count. Even the handful of notebook manufacturers that sold/sell notebooks with MXM cards don't provide an option for you to buy a newer MXM card from them and upgrade your existing notebook.
Bottom line, although the technology exists there isn't a practical solution for upgrading notebook graphics. -
MXM-Upgrade.com and MXMVideoQuest.com are the only two quasi-legitimate stores I can think of.
And yes, it's unfortunate that MXM has become more of a boon to ODMs/OEMs than to end users, but that's the way things go. Especially considering that even the manufacturers that use MXM would void your warranty faster than you could say "I upgraded my graphics card..."
I guess for a blanket statement it still works, but those of us on NBR-Acer reserve the right to upgrade our MXM cards as we please. -
Yep, they've all got problems. At least I have a somewhat straightforward Nvidia GPU in my laptop, and a nice ATI card in my desktop. Intel just deserves to be dragged out and shot.
-
-> Here!<-
-
allfiredup Notebook Virtuoso
The powers-that-be at nVIDIA need to be flogged for their nightmare of a model naming scheme!
They also deserve a few extra whacks for creating models that are almost identical. Example- the 9200M GS and 9300M GS- the only difference is that one has a 1300MHz shader speed and the other 1400MHz. They perform almost identically when comparing the same memory size/type of each model.
I also think that making versions of a particular model in both DDR2 and DDR3/GDDR3 is very frustrating! The performance differences are substantial, as noted above, but even most savvy consumers aren't sure which 'version' they're getting! ATI and nVIDIA are both guilty of this.
For example, the ATI HD 4570 is available with either DDR2 or GDDR3 memory. The DDR2 memory speed is clocked at 500MHz, while the GDDR3 memory speed is 680MHz. Comparing 512mb versions of each, the difference in 3DMark06 performance is well over 1,000 points! Why not just call the DDR2 version the 4550 and the GDDR3 version the 4570???
Another example- the Dell Latitude E6400 and E6500 both have the nVIDIA Quadro NVS 160M, but the E6400 has the DDR2 version (400MHz memory speed) and the E6500 has the GDDR3 version (700MHz memory speed). There is no mention of this on Dell's website, though. The difference in performance is almost 600 points on 3DMark06.
And my final complaint, for the moment, is when the same GPU model is offered in the same notebook with different memory amounts. Case in point- the Dell Studio 1555- it can be ordered with the 256mb or 512mb version of the ATI HD 4570 (both GDDR3). Performance between the two is virtually identical in most tests. -
shoelace_510 8700M GT inside... ^-^;
-
-
I'm sure a lot of people would like to see someone at nVIDIA get flogged for their handling of the overheating GPU fiasco, too. Certainly something worthy of a mention in the complaint department - even if it is fixed now, it's a terrible business practice to completely deny such a widespread mistake, and something they may do again.
Then there's the nVIDIA driver BSOD problems. Mainly a problem soon after Vista's launch, but I'm sure there's still people who'd like to see some floggings over that. I wouldn't mind it myself right now, having gotten two nvdisp.dll BSODs in the past three hours, at least one of which was caused by an infinite loop - and that's on XP. Fortunately it's the most BSODs from nVIDIA I've gotten in 3 hours in a very long time - possibly forever - but it's still a tad irksome, especially given that these particular drivers had only BSOD'ed once in 6 months before (and I wasn't even running the same programs the two times it did tonight).
Will be looking for part two. Off for now, as my battery's about to die! -
They need to fix their laptop card naming, yet they're busy "fixing" their perfectly understandable desktop GPU names. sad lol
Good article, and I can't wait for Part 2! -
Very nice article, finally some clarifications. Thank you and can't wait for part 2. -
-
allfiredup Notebook Virtuoso
Core/Shader/Memory Speeds
9200M GS- 550/1300/700
9300M GS- 550/1400/700
9300M G- 400/800/600
I found a few examples over at LAPTOP Magazine to compare the GS models-
Samsung X460 (9200M GS) - 3DMark06- 2082
Lenovo ThinkPad SL400 (9300M GS) - 3DMark06- 2191
I also found a review of the ASUS A6S with the 9300M G- 3DMark06- 1665 -
And just to confuse things more, the 9300M G is the rebranded 8400M GS. Remember THAT fiasco from when the 9 Series was just starting out?
-
I own a Sager 5793 with a 9800 GT video card in it. I cannot upgrade the video card on it because the cards used in the 5797 use the same interface but differ voltage-wise, from what I understand. But if the card fries, I can get a new one and simply plug in a replacement at a cost of $400+.
As far as graphics go, the article is spot on. Avoid Intel like the plague. However, if you're using a machine with onboard graphics, don't expect to play anything 3D-heavy like an FPS with any quality. Stick to a high-end Nvidia or ATI part if you want to do that. It will cost more, but it's worth it if you use the machine for gaming.
For the guy asking about CAD: you can get an Nvidia Quadro in some laptops, most notably a Sager. I'm not really familiar with CAD and the hardware for it, but this may be an answer.
Unfortunately the interface for mobile graphics, MXM, is there, but it is by no means a standard as things change all the time from model to model and manufacturer to manufacturer.
As far as manufacturers go, Nvidia is simply die-shrinking and renaming. Their 2XX mobile chips are nothing but a die-shrunk 88XX mobile chip from a couple of years ago. They clock them up slightly and give them a rename and voila, new mobile chip. Their current desktop chip, the 2XX, simply uses too much power and has too big a die for a mobile chip. ATI has pushed to 40nm and is actually making new chips, but their driver support blows.
Good article, I look forward to part 2. -
Um, Kaldor? The newest of the new 200M series will be 40nm GT200 cores.
-
Great article, well done. Got a good laugh from the MXM card picture. Anyhow, I agree about the naming scheme. I used to understand all of ATI's and nVidia's lineup. Now the problem for me is the rebadging. I mean, a 9800M GT is an 8800M GTX, a 9500M GS is an 8600M GT, a 9300M G is an 8400M GS... seriously, what is the point of all this renaming? I understand most of the cards, but often I'll have to look up specs, something I didn't have to do before.
-
Bashing Intel's graphics is very trendy. A few points to ponder:
1. Has anyone here noticed the battery life on Intel CULV laptops? Notice that AMD Neo doesn't come close? Wonder if there is a connection...
2. The phrasing on decode acceleration is accurate, if somewhat confusingly worded. Of course, all new laptops using Intel graphics do, in fact, ship with the Intel GMA4500. And the video quality on the 4500 is equal or better than competitive integrated and discrete parts. Moreover decode acceleration is only really needed for HD, BluRay - the cpu can handle decode of standard def content just fine (e.g DVDs).
3. Intel's graphics are paired with discrete graphics in several vendor's laptops in a switchable configuration - including my T400, Sony has a model - more coming this year to allow you the best of both worlds - Intel graphics for battery life, discrete for gaming with no reboot required.
4. While historically Intel's gaming graphics were riddled with compatibility issues and lousy performance, the story is getting better. For example, the G45 meets the minimum spec for Ghostbusters - must have met the publisher's playability targets... -
I have an Asus N10J with the Nvidia 9300M GS. It has 16 cores.
Check this link also for proof.
http://forum.notebookreview.com/showthread.php?t=307767
I am a lot more inclined to believe GPU-Z than Nvidia's site. Even the frequencies they give are far off from production cards, and you can see that just by comparing the scores you posted to the N10J's. At any rate, my point was that their naming scheme is causing so much confusion it's hard even for an enthusiast to know what he is buying. ATI is a little better, but not a lot. -
allfiredup Notebook Virtuoso
But I'm even more confused now- my Dell Latitude E6400 has the nVIDIA Quadro NVS 160M graphics card, which is a GeForce 9300M GS with drivers optimized for business apps. According to both nVIDIA and GPU-Z, it only has 8 shaders... -
Also, here is a printout from Everest regarding the new G 105M:
Video Adapter nVIDIA GeForce G 105M
GPU Code Name G98M
PCI Device 10DE-06EC / 1025-0167 (Rev A2)
Process Technology 65 nm
Bus Type PCI Express 2.0 x16 @ x1
Memory Size 512 MB
GPU Clock (Geometric Domain) 182 MHz
GPU Clock (Shader Domain) 364 MHz
RAMDAC Clock 400 MHz
Pixel Pipelines 4
TMU Per Pipeline 1
Unified Shaders 16 (v4.0)
DirectX Hardware Support DirectX v10
Pixel Fillrate 728 MPixel/s
Texel Fillrate 1456 MTexel/s
As you can see, it reports 16 shaders too. Maybe GPU-Z could be wrong, but Everest is reporting the same thing. This card is supposed to be just an overclocked 9300M GS, and Nvidia's site lists it as an 8-core card. I do believe it is an overclocked 9300M GS, but I also think whoever writes Nvidia's spec sheets should be given an Ion netbook to play Crysis on all day long as punishment for screwing up product naming and specifications like that. Maybe they imagine nobody cares what they sell as long as the name is up to date.
I'm sad to say that between GPU-Z, Everest and Nvidia's site, I would go with the former two applications 99% of the time. So there you go, more proof of the confusion. -
allfiredup Notebook Virtuoso
A final footnote to my previous rants-
I thought that nVIDIA would at least NOT use model numbers in their new naming scheme that duplicate existing Quadro models! In particular, the GeForce GTS 150M and 160M...and the current Quadro NVS models are, surprise, the 150M and 160M!
It's kinda like how a Dell Inspiron 15 can be a 1525 or 1545, or the Studio 15 is a 1535, 1537 or 1555....
As they say, "common sense isn't very common"! -
Jayayess1190 Waiting on Intel Cannonlake
-
-
2. Yeah, you can watch DVDs on Intel graphics, but it falls flat on its face with anything else.
3. Who is making the discrete card that Intel is going to allow on their closed platform? Are they using an MXM-style card? Link please.
4. As far as minimum spec goes. Who games at minimum spec?
Intel graphics are fine if you never do anything other than work on the desktop in Windows. They can't do HD content, and they can't game other than at minimum settings, and even that is probably a stretch.
Pretty good read on the current state of performance graphics for laptops:
http://www.tomshardware.com/reviews/geforce-gtx-280m,2353.html -
-
allfiredup Notebook Virtuoso
Intel® Graphics Media Accelerator X4500HD (Intel® GMA X4500HD), includes built-in support for full 1080p high-definition video playback, including Blu-ray* disc movies. This powerful video engine provides users with a rich, new media experience to deliver smooth HD playback without the need for add-in video cards or decoders. -
Another interesting point is the chart at the top, which I noticed before the author pointed it out in the article: the jump in power consumption when going from the 240M (55nm part, GDDR3) to the 250M/260M (40nm, GDDR5). I don't think TSMC has the 40nm process quite right, in spite of what they say, and they are experiencing leakage like the ATI 4770 desktop part. Couple this with GDDR5, which uses more voltage, and the setup uses more power and thus runs hotter, which is not good in a mobile device.
I still stand by the idea that it's most likely an offshoot of the low-buck 40nm, DX 10.1 chip. Most likely a very highly binned chip that is set aside for use in mobile devices, with the others pushed toward desktop OEMs. Nvidia has been less than honest before. What's to stop them now?
-
4500MHD - HD playback if you've got everything set up just right.
GMA 900/950/X3100 - lol no. -
Jayayess1190 Waiting on Intel Cannonlake
-
That's not the GMA 950, that's your CPU. That's why all these netbooks with the Intel Atom and GMA 950 struggle to play HD content. We're talking about video acceleration using the GPU to assist your CPU.
-
Haha I laughed at the Creative joke!
-
-
The amount of GPU decode assist needed to play back a video depends on a number of variables- CPU speed, bitrate, and codec used. My 2.0 GHz C2D T7250 can easily play back a 1080i MPEG-2 at 19 Mbps without the GPU helping at all (XVideo). However, the CPU meets its match when playing 720p H.264 video. I can't play 1080-line H.264 or MPEG-4 on this machine until the decode assist for that makes it into a stable driver.
-
RE:
Sony Vaio VGN Z790DND - NVidia discrete graphics
I've heard rumors that several other OEMs are interested in pairing GMA4500 with discrete in mobile offerings as well.
Satisfied?