The Notebook Review forums were hosted by TechTarget, who shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information posted on the forums is preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.

    NotebookReview.com Laptop Graphics Guide 2009: Part One

    Discussion in 'Notebook News and Reviews' started by Dustin Sklavos, Jul 8, 2009.

  1. Dustin Sklavos

    Dustin Sklavos Notebook Deity NBR Reviewer

    Reputations:
    1,892
    Messages:
    1,595
    Likes Received:
    3
    Trophy Points:
    56

    by Dustin Sklavos

    And so it begins ... after the long hiatus, the Notebook Review Mobile GPU Guide's 2009 Edition is rushed out of the gates before Nvidia can make matters more complicated with more GPUs. I have to be honest, part of the reason you didn't see this article sooner is because, simply put, Nvidia created “brand spaghetti” in the marketplace. Despite their mobile parts being based on only a couple of different chips, they have a full twenty-five (25!) parts in circulation right now, and that's not including the recently announced G200 line, which I'll talk about in brief towards the end. ATI isn't doing too much better at twenty circulating variants, but their parts are far easier to keep track of and much easier to describe.

    Because of the radical changes to the graphics market since my last guide, I have to revamp my approach to notebook Graphics Processing Units (GPUs). Pipelines and all that garbage are past tense now, with almost all mobile parts using the unified shaders that DirectX 10 (introduced with Windows Vista) mandates. And because of the product flood in the notebook market, I'm going to take a different approach and instead organize mobile graphics cards by the chips that power them. When you get to the Nvidia sections in particular, you'll see exactly how practical this approach is.

    Finally, before you get into this guide you may want to have a look at my “How it Works” entry on mobile graphics.

    NO YOU CAN'T UPGRADE YOUR NOTEBOOK GRAPHICS

    You cannot. Stop posting in the forums. Stop asking about this. You just can't. Moving on.

    [Image: an upgradeable MXM graphics card]

    UNIFIED SHADERS

    Windows Vista brought with it DirectX 10, and with DirectX 10 came a completely new approach to handling shaders. Gone are the distinct pixel and vertex shaders, replaced by unified shader technology that's much more flexible.

    With each GPU I'll be noting the number of unified shaders in that part, but I want to make clear that Nvidia and ATI use completely different approaches to their shader designs. For example, Nvidia's top end part on the desktop has 240 unified shaders, while ATI's has a staggering 800. If you look at the raw numbers, the ATI part should be monumentally faster, but the designs of the shaders are actually radically different and as a result, Nvidia's top end outperforms ATI's. Thus, the number of shaders should only be used to compare same-branded parts and not ATI vs. Nvidia.

    More is, of course, better, but will also draw more power and throw more heat.
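
    To see why the raw counts mislead, here's a quick back-of-the-envelope sketch of peak theoretical throughput, using the commonly cited desktop figures (illustrative only; the FLOPs-per-clock values are the usual quoted numbers for each architecture):

        # Peak theoretical shader throughput:
        #   GFLOPS = shaders x clock (GHz) x FLOPs issued per shader per clock
        def peak_gflops(shaders, clock_ghz, flops_per_clock):
            return shaders * clock_ghz * flops_per_clock

        print(peak_gflops(240, 1.296, 3))  # Nvidia GTX 280: ~933 GFLOPS
        print(peak_gflops(800, 0.750, 2))  # ATI HD 4870: 1200 GFLOPS

    On paper ATI wins, yet Nvidia's simpler scalar shaders are far easier to keep fully busy than ATI's five-wide units, which is a big part of why real game results don't track the raw numbers.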

    MEMORY BUS WIDTH AND TYPE

    One thing that hasn't really changed much over time is memory bus technology. In general, you will see three different memory bus widths on mobile parts: 64-bit, 128-bit, and 256-bit. Parts worth gaming on will generally never have a 64-bit memory bus, which is the narrowest and slowest. A 256-bit bus, on the other hand, is much more expensive to produce and so will only appear on absolute top-end cards. The happy medium is often a 128-bit bus.

    There are also four types of memory in circulation for mobile graphics. The first three differ generally in the top speed they can run at, while the fourth is newer and very different from its predecessors.

    These first three are, in order of performance capacity, DDR2, DDR3, and GDDR3. Many manufacturers will mix up “DDR3” and “GDDR3,” and for the most part that's okay as they'll have pretty similar performance characteristics. DDR2 is the slowest by a mile and on most parts is going to be the second biggest performance bottleneck, next to the memory bus width. If you're going to be gaming, you'll really want to avoid DDR2 if possible.

    The fourth and still somewhat rare memory technology is GDDR5. GDDR5 actually runs at a quadruple data rate instead of double like the other memory technologies, and can produce mountains of bandwidth. Using GDDR5 effectively works like a jump in memory bus width: GDDR5 on a 128-bit bus can produce memory bandwidth comparable to a 256-bit bus, and on a 256-bit bus can produce staggering bandwidth comparable to a 512-bit bus! As someone who actually has a desktop card using GDDR5, I can say it works pretty much as advertised; when tweaking the clock speeds on my graphics card, the memory speed is almost never the bottleneck.
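
    The arithmetic behind that claim is simple. A minimal sketch (the 1000 MHz clock is a made-up round number rather than any specific part):

        # Peak memory bandwidth:
        #   bytes/s = memory clock x transfers per clock x bus width in bytes
        # DDR2/DDR3/GDDR3 transfer twice per clock; GDDR5 transfers four times.
        def bandwidth_gbps(clock_mhz, transfers_per_clock, bus_width_bits):
            return clock_mhz * 1e6 * transfers_per_clock * (bus_width_bits / 8) / 1e9

        print(bandwidth_gbps(1000, 2, 128))  # GDDR3 on a 128-bit bus: 32.0 GB/s
        print(bandwidth_gbps(1000, 4, 128))  # GDDR5 on a 128-bit bus: 64.0 GB/s
        print(bandwidth_gbps(1000, 2, 256))  # GDDR3 on a 256-bit bus: also 64.0 GB/s

    Same clock and same 128-bit bus, but the GDDR5 part matches the bandwidth of the 256-bit GDDR3 design.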


    COMPARABLE DESKTOP PART

    Outside of the new G200 lineup that Nvidia has recently announced, mobile GPUs are always cherry-picked desktop GPUs. It's the exact same silicon with tweaked clock speeds. As a result, each mobile part has a desktop analogue that it can be compared to. Since reviews of mobile graphics are so rare (I try to do my share, but it doesn't seem like enough of them ever pass through my hands), it can be helpful to be able to search for a desktop part and at least get a ballpark figure of how the mobile part you're looking at will run.

    DIRECTX 10.1 vs. PHYSX/CUDA

    One of the big differences between ATI and Nvidia right now is the technologies they're pushing to compete with one another. Up until this point (the point of Nvidia's announced G200 parts), ATI has been the only vendor producing parts compatible with DirectX 10.1 (as introduced in Windows Vista SP1). DirectX 10.1 support has been fairly rare in games, with the most notable introduction so far having been Ubisoft's Assassin's Creed. If you're looking to buy that particular game, do not buy the Steam version. Instead, buy a retail copy and do NOT patch it. Ubisoft removed the DirectX 10.1 support in a patch, claiming it was buggy (it wasn't), and with that support in place, ATI cards have a massive performance advantage over the competition. Outside of this instance, DirectX 10.1 hasn't been terribly relevant.

    But then again, neither has PhysX. On-chip PhysX is only usable on Nvidia's higher end parts, and can add additional detail to games that support it, like realistic cloth, breaking glass, etc. Unfortunately, like DirectX 10.1, PhysX hasn't proved remarkably compelling either, with the only notable title using PhysX hardware acceleration being the game Mirror's Edge.

    Alongside PhysX in the Nvidia corner is CUDA, which is Nvidia's general purpose GPU computing platform. CUDA is seeing decent support and may be of interest to videophiles, where GPU-accelerated video encoding can produce healthy performance gains in CUDA-enabled applications. That said, I edit video on my desktop and have yet to see a need for a CUDA-enabled application. More than that, CUDA's shelf life may not be that long with the OpenCL standard beginning to surface. OpenCL is similar to CUDA, except that it's platform-independent. I can't imagine developers playing the vendor lock-in game and only using CUDA when OpenCL (and even Microsoft's upcoming DirectX 11 Compute Shaders) can run on either company's GPUs.
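
    To give a flavor of what that platform independence looks like, here's a minimal vector-add sketch using the PyOpenCL bindings (an illustration, not anything from the article; it assumes you have an OpenCL runtime installed, from either vendor):

        import numpy as np
        import pyopencl as cl

        # Picks whatever OpenCL device the system exposes, ATI or Nvidia alike.
        ctx = cl.create_some_context()
        queue = cl.CommandQueue(ctx)

        a = np.random.rand(50000).astype(np.float32)
        b = np.random.rand(50000).astype(np.float32)

        mf = cl.mem_flags
        a_buf = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=a)
        b_buf = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=b)
        out_buf = cl.Buffer(ctx, mf.WRITE_ONLY, a.nbytes)

        # The kernel itself is plain C; each work-item adds one pair of elements.
        prg = cl.Program(ctx, """
        __kernel void add(__global const float *a,
                          __global const float *b,
                          __global float *out) {
            int i = get_global_id(0);
            out[i] = a[i] + b[i];
        }
        """).build()

        prg.add(queue, a.shape, None, a_buf, b_buf, out_buf)

        result = np.empty_like(a)
        cl.enqueue_copy(queue, result, out_buf)
        assert np.allclose(result, a + b)

    A CUDA version would look much the same but would only run on Nvidia hardware, which is exactly the lock-in problem.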

    These are things to be aware of, but they shouldn't affect your decision.

    MOBILE DRIVERS

    This, on the other hand, probably should impact your decision process. As much as I have a stated preference for ATI's hardware, they're woefully behind on the front of providing unified mobile drivers. The Nvidia user is going to be able to update his or her video drivers with new releases (meaning new fixes and performance improvements) just by visiting Nvidia's site and downloading new ones. ATI users aren't so fortunate; if they want to update their drivers they have to either rely on the notebook manufacturer to update (good luck with that) or use third party software to modify desktop drivers (a chore).

    I don't have too much of a problem doing the latter, but it can be a real headache for the more novice users, and for that reason I would tend toward recommending Nvidia's mobile hardware for the time being until ATI can pick up the slack and make mobile drivers available on the ATI website.


    CROSSFIRE AND SLI

    Both ATI and Nvidia have multi-GPU solutions for notebooks that will, with two exceptions, only appear in massive desktop replacement units. ATI's is called Crossfire; Nvidia's is called SLI. Please note that these solutions typically don't bring a linear performance improvement; two GeForce GTX 280Ms aren't going to run twice as fast as one, as latency and driver optimization come into play. With this technology, the aforementioned driver situation becomes more important ... because if a game isn't properly profiled by the vendor in question the game won't reap the benefits of SLI or Crossfire.
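
    To make the non-linearity concrete, a toy calculation (the 70% scaling factor is invented for illustration; real scaling varies per game and driver profile):

        # A second GPU rarely doubles performance.
        single_gpu_fps = 40.0
        scaling_efficiency = 0.7   # falls to 0.0 if the game has no SLI/Crossfire profile

        dual_gpu_fps = single_gpu_fps * (1 + scaling_efficiency)
        print(dual_gpu_fps)  # 68.0 fps, not 80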

    Now, those two exceptions: ATI and Nvidia both have integrated graphics parts that, when combined with a low-end discrete part, can be used in Crossfire/SLI and thus improve performance substantially. These solutions are still nowhere near as good as mid-range and higher options, but they're also economical and good for battery life. Nvidia's solution with the GeForce 9400M, in particular, can also swap from a mid- or high-end discrete part to the IGP when the notebook is running on the battery, resulting in substantial power savings.

    A BRIEF NOTE ON INTEL

    Planning to play cutting edge 3D games? Excellent! Don't buy anything using Intel graphics. Intel's integrated graphics performance is historically poor and rife with compatibility issues. When you're looking between Intel parts, you're really only dealing with levels of unplayability.

    A BRIEF NOTE ON VIDEO ACCELERATION

    In addition to being miserable for gaming, Intel's parts outside of the 4500MHD are also the only ones in the lineup (along with ATI's Radeon X1200 integrated graphics series) that don't support high definition video decoding and acceleration. All other parts are designed to offload high definition video playback from the main processor to the GPU.


    COMPLAINT DEPARTMENT

    Finally, I'd just like to thumb my nose at all three graphics vendors (ATI, Nvidia, and Intel) for their complete lack of consumer-oriented business practices. ATI's mobile graphics driver situation is a nightmare; miles behind Nvidia's driver support. ATI's marketing department isn't doing them any favors either; Nvidia routinely works with game developers to make sure games run well on their hardware, and their “The Way It's Meant to be Played” program is everywhere. Whether Nvidia pays developers to cripple games or not (see the Assassin's Creed controversy), ATI's not out there hustling.

    Intel's driver situation is, if at all possible, substantially worse than ATI's. I'm fairly certain their graphics driver team is either one over-caffeinated teenager in a basement somewhere, or a bunch of programmers that weren't good enough to code for Creative (at least two readers should laugh at this one). Intel has basic compatibility issues with games, and they've made promises about basic performance in their hardware that they have failed to keep. Marketing lies, but Intel's integrated graphics are still essentially broken as far as I'm concerned.

    Finally, whoever is responsible for Nvidia's mobile graphics branding needs to suffer at the hands of angry consumers ... or just be fired. It's bad enough that the market is over-saturated with mobile parts that are essentially the same but named differently, but worse, the brands of their mobile parts almost never line up with their desktop ones. The most egregious offenders are the GTX 280M and 260M, which are actually just G92 silicon -- in other words, these are not mobile versions of the desktop GTX 280 and 260, which are worlds more powerful.

    CONCLUSION OF PART I

    At this point we've covered most of the basics in terms of mobile GPUs. In part two, I'll cover the individual hardware and talk about the GPUs, what performance class they should find themselves in, and what your bare minimum for gaming ought to be. Stay tuned.

     
    Last edited by a moderator: Feb 6, 2015
  2. REMF

    REMF Notebook Consultant

    Reputations:
    0
    Messages:
    282
    Likes Received:
    0
    Trophy Points:
    30
    RE: Tigris/880G/785G

    AMD are going to look like a right bunch of berks if they release a 40-shader chip on 55nm now that nVidia's Ion2 IGP will come with twice the shaders of its predecessor and be fabbed at 40nm.

    they owned the IGP market for ages after the 780G was released, so it seems strange that they are willing to throw that away.......

    if the Tigris integrated GPU genuinely was a 4xxx series GPU with 80 shaders then they could have ruled the low-end/light-gaming crowd, especially with the new 45nm PII derived CPU. they baffle me sometimes.
     
  3. Clutch

    Clutch cute and cuddly boys

    Reputations:
    1,053
    Messages:
    2,468
    Likes Received:
    28
    Trophy Points:
    66
    Could you do a little thing about ATI's and NVidia's CAD GPUs?
     
  4. azu

    azu Notebook Enthusiast

    Reputations:
    0
    Messages:
    35
    Likes Received:
    0
    Trophy Points:
    15
    data isn't exactly widely available for the CAD GPUs, and they are usually also horrendously expensive, so it's somewhat understandable that they might not be covered. just letting you know.
     
  5. dubhagat

    dubhagat Notebook Consultant

    Reputations:
    8
    Messages:
    150
    Likes Received:
    0
    Trophy Points:
    30
    How do you rate the 4500MHD if it is not used for gaming, just for the occasional HD movie?
     
  6. Clutch

    Clutch cute and cuddly boys

    Reputations:
    1,053
    Messages:
    2,468
    Likes Received:
    28
    Trophy Points:
    66
    I know but there are people who come to this forum who need that kinda thing.
     
  7. AdiQue

    AdiQue Notebook Enthusiast

    Reputations:
    66
    Messages:
    32
    Likes Received:
    0
    Trophy Points:
    15
    Hi,

    Thanks to your post I realized that a 'mobile graphics' reference in a post's title will ultimately spur my interest, and at the very least make me think twice about resisting the urge to open it. A lesser geeky thing, that is...

    I for one almost did. Had I not wasted several precious hours of my life striving to make those supposedly 'industry standard' sound cards actually output sound rather than provoking me to output insults and invectives... I would find your remark funny. Sardonic; that's what it is to me.

    Here ^^ we come to the point where you have warranted rep: plus one for you, plus your dog. No, neither for devising this mini article, nor for your editorial efforts to put it all together. For that little moment of truth in which, sir, you outdid Chaz. Yes, I'm being serious. As far as I remember, he never openly stated his being inclined towards ATI in any of the many [graphics-]relevant press release comments/news bits posts.

    Take care,
    AdiQue

    PS Kudos for Chaz [yes, in advance] for not hating on me! ;]
     
    Last edited by a moderator: Jan 29, 2015
  8. TehSuigi

    TehSuigi Notebook Virtuoso

    Reputations:
    931
    Messages:
    3,882
    Likes Received:
    2
    Trophy Points:
    105
    I love how you say that "YOU CAN'T UPGRADE YOUR NOTEBOOK GRAPHICS" but you have a picture of an upgradeable MXM graphics card right in your article!

    Besides, with more companies buying into the MXM form factor (Acer now the biggest supporter), that tired old cliche deserves to be retired, or at least modified.

    UNLESS YOU ALREADY HAVE A DEDICATED GRAPHICS CARD AND OWN THE RIGHT BRAND OF LAPTOP, NO, YOU CANNOT UPGRADE YOUR GRAPHICS CARD.
    That should still shut up the people asking if they can upgrade their Intel GMAs, and the question of "what is the right brand of laptop?" can be directed here.

    Otherwise, great article.
     
  9. Jerry Jackson

    Jerry Jackson Administrator NBR Reviewer

    Reputations:
    3,075
    Messages:
    2,021
    Likes Received:
    34
    Trophy Points:
    66
    The inclusion of that photo was my call and intended to be as much of a joke as anything else. The "reality" is that even with notebooks that use MXM cards you still can't upgrade your graphics ... not "really" anyway.

    I mean, how many LEGITIMATE stores do you see selling new MXM cards so that you can swap your old MXM card out for a new card? Random ebay sellers don't count. Even the handful of notebook manufacturers that sold/sell notebooks with MXM cards don't provide an option for you to buy a newer MXM card from them and upgrade your existing notebook.

    Bottom line, although the technology exists there isn't a practical solution for upgrading notebook graphics.
     
  10. TehSuigi

    TehSuigi Notebook Virtuoso

    Reputations:
    931
    Messages:
    3,882
    Likes Received:
    2
    Trophy Points:
    105
    MXM-Upgrade.com and MXMVideoQuest.com are the only two quasi-legitimate stores I can think of.
    And yes, it's unfortunate that MXM has become more of a boon to ODM/OEMs than to the end users, but that's the way things go. Especially considering that even the manufacturers that use MXM would void your warranty faster than you could say "I upgraded my graphics card..."

    I guess for a blanket statement it still works, but those of us on NBR-Acer reserve the right to upgrade our MXM cards as we please. :)
     
  11. pixelot

    pixelot Notebook Acolyte

    Reputations:
    3,732
    Messages:
    6,833
    Likes Received:
    0
    Trophy Points:
    205
    Yep, they've all got problems. At least I have a somewhat straightforward Nvidia GPU in my laptop, and a nice ATI card in my desktop. Intel just deserves to be dragged out and shot. ;) :D
     
  12. Johnny T

    Johnny T Notebook Nobel Laureate

    Reputations:
    6,092
    Messages:
    12,975
    Likes Received:
    201
    Trophy Points:
    481
    Considering there is already a perfectly good guide/article on them (always updated), I think the article should just link to it instead of writing something new that might not be as good.

    -> Here!<-

    Actually, a lot of people come here to ask for notebooks with these GPUs. And they are not horrendously expensive. A workstation notebook with an ISV-certified GPU starts at around $1200 nowadays.
     
  13. allfiredup

    allfiredup Notebook Virtuoso

    Reputations:
    1,482
    Messages:
    3,209
    Likes Received:
    17
    Trophy Points:
    106
    The powers-that-be at nVIDIA need to be flogged for their nightmare of a model naming scheme!

    They also deserve a few extra whacks for creating models that are almost identical. Example- the 9200M GS and 9300M GS- the only difference is that one has a 1300MHz shader speed and the other 1400MHz. They perform almost identically when comparing the same memory size/type of each model.

    I also think that making versions of a particular model in both DDR2 and DDR3/GDDR3 is very frustrating! The performance differences are substantial, as noted above, but even most savvy consumers aren't sure which 'version' they're getting! ATI and nVIDIA are both guilty of this.

    For example, the ATI HD 4570 is available with either DDR2 or GDDR3 memory. The DDR2 memory is clocked at 500MHz, while the GDDR3 memory speed is 680MHz. Comparing 512MB versions of each, the difference in 3DMark06 performance is well over 1,000 points! Why not just call the DDR2 version the 4550 and the GDDR3 version the 4570???
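
    To put those two memory clocks in bandwidth terms, a rough sketch (assuming the mobile HD 4570's 64-bit bus):

        # Both variants are double data rate; only the memory clock differs.
        print(500e6 * 2 * (64 / 8) / 1e9)  # DDR2:  8.0 GB/s
        print(680e6 * 2 * (64 / 8) / 1e9)  # GDDR3: 10.88 GB/s, ~36% more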

    Another example- the Dell Latitude E6400 and E6500 both have the nVIDIA Quadro NVS 160M, but the E6400 has the DDR2 version (400MHz memory speed) and the E6500 has the GDDR3 version (700MHz memory speed). There is no mention of this on Dell's website, though. The difference in performance is almost 600 points on 3DMark06.

    And my final complaint, for the moment, is when the same GPU model is offered in the same notebook with different memory amounts. Case in point- the Dell Studio 1555- it can be ordered with the 256MB or 512MB version of the ATI HD 4570 (both GDDR3). Performance between the two is virtually identical in most tests.
     
  14. shoelace_510

    shoelace_510 8700M GT inside... ^-^;

    Reputations:
    276
    Messages:
    1,525
    Likes Received:
    0
    Trophy Points:
    55
    I agree... That modified sentence would fit much better, and be more accurate. ;) hehe
     
  15. peli_kan

    peli_kan Notebook Evangelist

    Reputations:
    228
    Messages:
    498
    Likes Received:
    0
    Trophy Points:
    30
    Thirded. 10ch
     
  16. Apollo13

    Apollo13 100% 16:10 Screens

    Reputations:
    1,432
    Messages:
    2,578
    Likes Received:
    210
    Trophy Points:
    81
    I'm sure a lot of people would like to see someone at nVIDIA get flogged for their handling of the overheating GPU fiasco, too. Certainly something worthy of a mention in the complaint department - even if it is fixed now, it's a terrible business practice to completely deny such a widespread mistake, and something they may do again.

    Then there's the nVIDIA driver BSOD problems. Mainly a problem soon after Vista's launch, but I'm sure there are still people who'd like to see some floggings over that. I wouldn't mind it myself right now, having gotten two nvdisp.dll BSODs in the past three hours, at least one of which was caused by an infinite loop - and that's on XP. Fortunately, it's the most BSODs from nVIDIA I've gotten in 3 hours in a very long time - possibly ever - but it's still a tad irksome, especially given that these particular drivers had only BSOD'ed once in 6 months before (and I wasn't even running the same programs the two times it did tonight).

    Will be looking for part two. Off for now, as my battery's about to die!
     
  17. JabbadaGriffin

    JabbadaGriffin Notebook Consultant

    Reputations:
    6
    Messages:
    168
    Likes Received:
    0
    Trophy Points:
    30
    They need to fix their laptop card naming, yet they're fixing their perfectly understandable desktop GPU names. sad lol

    Good article, and I can't wait for Part 2! :D
     
  18. amihalceanu

    amihalceanu Notebook Consultant

    Reputations:
    8
    Messages:
    141
    Likes Received:
    0
    Trophy Points:
    30
    I'm pretty sure this is not true. In fact the 9300 has 16 shaders and the 9200 has just 8; the performance difference between the two is large. Other than that I completely agree with everything that has been said.
    Very nice article, finally some clarifications. Thank you, and I can't wait for part 2.
     
  19. tianxia

    tianxia kitty!!!

    Reputations:
    1,212
    Messages:
    2,612
    Likes Received:
    0
    Trophy Points:
    55
    according to nvidia, 9200m gs and 9300m gs both have 8 cores, 9300m g has 16 cores.
     
  20. allfiredup

    allfiredup Notebook Virtuoso

    Reputations:
    1,482
    Messages:
    3,209
    Likes Received:
    17
    Trophy Points:
    106
    The 9200M GS and 9300M GS both have 8 shaders. The 9300M G has 16 shaders. Only a handful of models have used the 9300M G.

    Core/Shader/Memory Speeds
    9200M GS- 550/1300/700
    9300M GS- 550/1400/700
    9300M G- 400/800/600

    I found a few examples over at LAPTOP Magazine to compare the GS models-

    Samsung X460 (9200M GS) - 3DMark06- 2082
    Lenovo ThinkPad SL400 (9300M GS) - 3DMark06- 2191

    I also found a review of the ASUS A6S with the 9300M G- 3DMark06- 1665
     
  21. TehSuigi

    TehSuigi Notebook Virtuoso

    Reputations:
    931
    Messages:
    3,882
    Likes Received:
    2
    Trophy Points:
    105
    And just to confuse things more, the 9300M G is the rebranded 8400M GS. Remember THAT fiasco from when the 9 Series was just starting out?
     
  22. Kaldor

    Kaldor Notebook Enthusiast

    Reputations:
    0
    Messages:
    11
    Likes Received:
    0
    Trophy Points:
    5
    I own a Sager 5793 with a 9800 GT video card in it. I cannot upgrade the video card because the cards used in the 5797 use the same interface but are different voltage-wise, from what I understand. But if the card fries, I can get a new one and simply plug it in as a replacement, at the cost of $400+. :(

    As far as graphics go, the article is spot on. Avoid Intel like the plague. However, if you're using a machine with onboard graphics, don't expect to play anything 3D-heavy like an FPS with any quality. Stick to a high-end Nvidia or ATI part if you want to do that. It will cost more, but it's worth it if you use the machine for gaming.

    For the guy asking about CAD: you can get an Nvidia Quadro in some laptops, most notably a Sager. I'm not really familiar with CAD and the hardware for it, but this may be an answer.

    Unfortunately the interface for mobile graphics, MXM, is there, but it is by no means a standard as things change all the time from model to model and manufacturer to manufacturer.

    As far as manufacturers go, Nvidia is simply die-shrinking and renaming. Their 2XX mobile chips are nothing but a die-shrunk 88XX mobile chip from a couple of years ago. They clock them up slightly and give them a rename and voila, new mobile chip. Their current desktop chips, the 2XX, simply use too much power and have too big a die for a mobile chip. ATI has pushed to 40nm and is actually making new chips, but their driver support blows.

    Good article, I look forward to part 2.
     
  23. TehSuigi

    TehSuigi Notebook Virtuoso

    Reputations:
    931
    Messages:
    3,882
    Likes Received:
    2
    Trophy Points:
    105
    Um, Kaldor? The newest of the new 200M series will be 40nm GT200 cores.
     
  24. Azone

    Azone Notebook Evangelist

    Reputations:
    42
    Messages:
    325
    Likes Received:
    3
    Trophy Points:
    31
    Great article, well done. :) Got a good laugh from the MXM card picture. Anyhow, I agree about the naming scheme. I used to understand all of ATI's and nVidia's lineup. Now the problem for me is the rebadging. I mean, a 9800M GT is an 8800M GTX, a 9500M GS is an 8600M GT, a 9300M G is an 8400M GS...seriously, what is the point of all this renaming? I understand most of the cards, but often I'll have to look up specs, something I didn't have to do before.
     
  25. rootheday

    rootheday Newbie

    Reputations:
    0
    Messages:
    9
    Likes Received:
    0
    Trophy Points:
    5
    The bashing on Intel's graphics is very trendy. A few points to ponder:
    1. Has anyone here noticed the battery life on Intel CULV laptops? Notice that AMD Neo doesn't come close? Wonder if there is a connection...
    2. The phrasing on decode acceleration is accurate, if somewhat confusingly worded. Of course, all new laptops using Intel graphics do, in fact, ship with the Intel GMA4500. And the video quality on the 4500 is equal or better than competitive integrated and discrete parts. Moreover decode acceleration is only really needed for HD, BluRay - the cpu can handle decode of standard def content just fine (e.g DVDs).
    3. Intel's graphics are paired with discrete graphics in several vendor's laptops in a switchable configuration - including my T400, Sony has a model - more coming this year to allow you the best of both worlds - Intel graphics for battery life, discrete for gaming with no reboot required.
    4. While historically Intel's gaming graphics were riddled with compatibility issues and lousy performance, the story is getting better. For example, the G45 meets the minimum spec for Ghostbusters - must have met the publisher's playability targets...
     
  26. amihalceanu

    amihalceanu Notebook Consultant

    Reputations:
    8
    Messages:
    141
    Likes Received:
    0
    Trophy Points:
    30
    I don't want to contradict you AND nvidia, but I think the person that writes the nvidia gpu page has nothing to do with computer architecture.
    I have an asus n10j with an nvidia 9300M GS. It has 16 cores.
    Check this link also for proof.
    http://forum.notebookreview.com/showthread.php?t=307767

    I am a lot more inclined to believe GPU-Z than nvidia's site. Even the frequencies they give are far off from production cards, and you can see that even by comparing the scores you posted to the N10J card. At any rate, my point was that their naming scheme is causing so much confusion it's hard even for an enthusiast to know what he is buying. ATI is a little better, but not a lot.
     
  27. allfiredup

    allfiredup Notebook Virtuoso

    Reputations:
    1,482
    Messages:
    3,209
    Likes Received:
    17
    Trophy Points:
    106
    I don't doubt that nVIDIA's site could be wrong...who the heck knows!?

    But I'm even more confused now- my Dell Latitude E6400 has the nVIDIA Quadro NVS 160M graphics card, which is a GeForce 9300M GS with drivers optimized for business apps. According to both nVIDIA and GPU-Z, it only has 8 shaders...
     
  28. amihalceanu

    amihalceanu Notebook Consultant

    Reputations:
    8
    Messages:
    141
    Likes Received:
    0
    Trophy Points:
    30
    Also, here is a printout from Everest regarding the new G 105M:
    Video Adapter nVIDIA GeForce G 105M
    GPU Code Name G98M
    PCI Device 10DE-06EC / 1025-0167 (Rev A2)
    Process Technology 65 nm
    Bus Type PCI Express 2.0 x16 @ x1
    Memory Size 512 MB
    GPU Clock (Geometric Domain) 182 MHz
    GPU Clock (Shader Domain) 364 MHz
    RAMDAC Clock 400 MHz
    Pixel Pipelines 4
    TMU Per Pipeline 1
    Unified Shaders 16 (v4.0)
    DirectX Hardware Support DirectX v10
    Pixel Fillrate 728 MPixel/s
    Texel Fillrate 1456 MTexel/s

    As you can see, it reports 16 shaders too. Maybe GPU-Z could be wrong, but Everest is reporting the same thing. This card is supposed to be just an overclocked 9300M GS, and nvidia's site lists it as an 8-core card. I do believe it is an overclocked 9300M GS, but I also think whoever writes nvidia's spec sheets should be given an Ion netbook to play Crysis on all day long as punishment for screwing up product naming and specifications like that. Maybe they imagine nobody cares what they sell as long as the name is up to date.
    I'm sad to say that between GPU-Z, Everest and nvidia's site, I would go with the two applications 99% of the time. So there you go, more proof of the confusion :)
     
  29. allfiredup

    allfiredup Notebook Virtuoso

    Reputations:
    1,482
    Messages:
    3,209
    Likes Received:
    17
    Trophy Points:
    106
    A final footnote to my previous rants-

    I thought that nVIDIA would at least NOT use model numbers in their new naming scheme that duplicate existing Quadro models! In particular, the GeForce GTS 150M and 160M...and the current Quadro NVS models are, surprise, the 150M and 160M!

    It's kinda like how a Dell Inspiron 15 can be a 1525 or 1545, or the Studio 15 is a 1535, 1537 or 1555....

    As they say, "common sense isn't very common"!
     
  30. Jayayess1190

    Jayayess1190 Waiting on Intel Cannonlake

    Reputations:
    4,009
    Messages:
    6,712
    Likes Received:
    54
    Trophy Points:
    216
    Perfectly fine for doing that.
     
  31. amihalceanu

    amihalceanu Notebook Consultant

    Reputations:
    8
    Messages:
    141
    Likes Received:
    0
    Trophy Points:
    30
    Completely agree. IMO creating this kind of confusion will only hurt them in the long run. This type of trick alienates enthusiasts, who ultimately drive a lot of buying decisions by making recommendations to less knowledgeable users. And with their 8400/8600 defects fiasco they drive away consumers and corporate customers. It's such a shame; I used to like nvidia, and their products are still very good, but their marketing is almost worse than Apple's.
     
  32. Kaldor

    Kaldor Notebook Enthusiast

    Reputations:
    0
    Messages:
    11
    Likes Received:
    0
    Trophy Points:
    5
    Proof please? The 40nm DirectX 10.1 chip is a low-dollar desktop part from all I've seen. It is another die-shrunk core. The 55nm cores from the 2XX cards like the 260 216 and 275 are simply too large and power-hungry to be used in a laptop. Nvidia has gotten some really good mileage out of the G80, G92 and G92b cores.

    1. Yup, but lower power = lower performance for the most part. It's a balancing act. The reason other platforms use more power is that they stomp Intel's G45 into the ground on anything other than basic desktop apps.
    2. Yeah, you can watch DVDs on Intel graphics, but it falls flat on its face with anything else.
    3. Who is making the discrete card that Intel is going to allow on their closed platform? Are they using an MXM-style card? Link please.
    4. As far as minimum spec goes: who games at minimum spec?

    Intel graphics are fine if you never do anything other than work on the desktop in Windows. They can't do HD content, and they can't game other than at minimum settings, and even that is probably a stretch.

    Pretty good read on the current state of performance graphics for laptops:
    http://www.tomshardware.com/reviews/geforce-gtx-280m,2353.html
     
  33. TehSuigi

    TehSuigi Notebook Virtuoso

    Reputations:
    931
    Messages:
    3,882
    Likes Received:
    2
    Trophy Points:
    105
    This was the article I was thinking of - how reliable is PCPerspective's info, anyway?

     
  34. allfiredup

    allfiredup Notebook Virtuoso

    Reputations:
    1,482
    Messages:
    3,209
    Likes Received:
    17
    Trophy Points:
    106
    What do you mean when you say they can't "do" HD content?

    Intel® Graphics Media Accelerator X4500HD (Intel® GMA X4500HD), includes built-in support for full 1080p high-definition video playback, including Blu-ray* disc movies. This powerful video engine provides users with a rich, new media experience to deliver smooth HD playback without the need for add-in video cards or decoders.
     
  35. Kaldor

    Kaldor Notebook Enthusiast

    Reputations:
    0
    Messages:
    11
    Likes Received:
    0
    Trophy Points:
    5
    Good article overall, and it raises some good questions. The 2XX 40nm chip they are releasing is nothing like the desktop 2XX chip if it's 40nm, using GDDR5 and DX 10.1. I guess time will tell on actual performance numbers. From everything I've seen, the ATI mobile 4850 is the fastest out there atm, as the current 280m is pretty much fail, being nothing but a rename of the 9XXX mobile part. It's about time that they actually made something new.

    Another interesting point is the chart at the top, which I noticed before the author pointed it out in the article: the jump in power consumption when going from the 240m (55nm part, GDDR3) to the 250m/260m (40nm, GDDR5). I don't think TSMC has the 40nm quite right, in spite of what they say, and they are experiencing leakage like the ATI 4770 desktop part. Couple this with GDDR5, which uses more voltage, and the setup uses more power and thus runs hotter, which is not good in a mobile device.

    I still stand by the idea that it's most likely an offshoot of the low-buck 40nm, DX 10.1 chip. Most likely a very highly binned chip that is set aside for use in mobile devices, with the others pushed toward desktop OEMs. Nvidia has been less than honest before. What's to stop them now?
     
  36. Kaldor

    Kaldor Notebook Enthusiast

    Reputations:
    0
    Messages:
    11
    Likes Received:
    0
    Trophy Points:
    5
    Not to flame, but did you copy that directly from Intel's website? Also, what's the asterisk for? Long and short is that nothing Intel has can hold a candle to what AMD and Nvidia can do, both in HD playback and gaming. It's been proven time and time again in countless reviews.
     
  37. TehSuigi

    TehSuigi Notebook Virtuoso

    Reputations:
    931
    Messages:
    3,882
    Likes Received:
    2
    Trophy Points:
    105
    4500MHD - HD playback if you've got everything set up just right.
    GMA 900/950/X3100 - lol no.
     
  38. Jayayess1190

    Jayayess1190 Waiting on Intel Cannonlake

    Reputations:
    4,009
    Messages:
    6,712
    Likes Received:
    54
    Trophy Points:
    216
    GMA 950 plays 720p with no problems at all.
     
  39. TehSuigi

    TehSuigi Notebook Virtuoso

    Reputations:
    931
    Messages:
    3,882
    Likes Received:
    2
    Trophy Points:
    105
    That's not the GMA 950, that's your CPU. That's why all these netbooks with the Intel Atom and GMA 950 struggle to play HD content. We're talking about video acceleration using the GPU to assist your CPU.
     
  40. Raeglatem

    Raeglatem Notebook Consultant

    Reputations:
    0
    Messages:
    189
    Likes Received:
    0
    Trophy Points:
    30
    Haha I laughed at the Creative joke!
     
  41. IMNOTDRPHIL

    IMNOTDRPHIL Notebook Enthusiast

    Reputations:
    69
    Messages:
    42
    Likes Received:
    1
    Trophy Points:
    15
    My Latitude E5400 has the GM45 (X4500 MHD) unit and it works well for HD movie playback as it can decode MPEG-2, MPEG-4, and a few other formats I don't use like VC-1. Intel's drivers are spotty no matter what OS you use, though- the Creative joke was spot-on. The hardware is fine when the drivers actually work, but you have to be very judicious in upgrading the drivers for the Intel IGPs.
     
  42. IMNOTDRPHIL

    IMNOTDRPHIL Notebook Enthusiast

    Reputations:
    69
    Messages:
    42
    Likes Received:
    1
    Trophy Points:
    15
    AMD's Athlon Neo has a considerably higher TDP- that could be part of it. Plus AMD notebook CPUs are all 65 nm while Intel's current ULV chips are 45 nm, so idle power should be a lot lower with the Intel CPUs.

    The X4500 MHD does have good video quality, but at least in my opinion, the video quality from my desktop's Radeon HD 3850 is better. The proprietary AMD GPU drivers (fglrx) yield poorer video quality than the Intel drivers, as there is a lot more tearing, but the open-source ATi GPU drivers have, IMHO, the best video playback quality I've ever seen. They are tear-free and the picture quality is stunning, while the Intel drivers give good picture quality but with some occasional tearing.

    The amount of GPU decode assist needed to play back a video depends on a number of variables- CPU speed, bitrate, and codec used. My 2.0 GHz C2D T7250 can easily play back a 1080i MPEG-2 at 19 Mbps without the GPU helping at all (XVideo). However, the CPU meets its match when playing 720p H.264 video. I can't play 1080-line H.264 or MPEG-4 on this machine until the decode assist for that makes it into a stable driver.

    The Intel unified-shader units (X3000, X3500, X4500) are considerably different from the old GMA units that didn't even have hardware vertex shaders or support OpenGL later than 1.4. The GM45 isn't a great gaming unit, but one can at least play games at low resolutions on the unit, unlike many of the older GMA units. I used to have a machine with the 945GM and it wasn't even possible to play very many games at 800x600, while the GM45 can play at 1024x768 or sometimes even the unit's native 1440x900 with simpler games and still hit 30 fps.
     
  43. rootheday

    rootheday Newbie

    Reputations:
    0
    Messages:
    9
    Likes Received:
    0
    Trophy Points:
    5