The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information posted on the forums is preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.

    How good will the integrated X3100 GPU eventually become?

    Discussion in 'Lenovo' started by clyde1, Sep 7, 2007.

  1. clyde1

    clyde1 Notebook Consultant

    Reputations:
    2
    Messages:
    138
    Likes Received:
    0
    Trophy Points:
    30
    Who has a crystal ball and can predict how good the integrated X3100 GPU solution will eventually become once the drivers are optimized? Will it become, say, 30% better than it is now?

    Will it ever approach the performance of an entry level dedicated GPU such as the ATI X1400 or Nvidia Quadro NVS 140M with 128MB?

    For normal non-gaming use, is the X3100 a bottleneck in any way with regards to performance (speed) with Vista?

    And how much of a benefit to battery life does the X3100 offer compared to the above dedicated GPUs?



    I'm trying to configure a system for my father-in-law, who wants a notebook alternative to his old desktop.

    I'm thinking about either the R61 14" WXGA or T61 15.4" (WSXGA or WSXGA+ ???), with the integrated X3100.

    He won't need gaming or heavy video editing capability, but would probably like the idea that moderate video editing can be done without a problem.

    I see a lot of comments about how much better battery life is with an integrated GPU, but some of the battery life reports I've heard for the R61 with the integrated X3100 didn't sound that great to me.

    He'll likely be plugged in most of the time anyway, so battery life is probably not a big consideration for him, but I'm just curious about it for myself. But price certainly matters and X3100 will be cheaper than a dedicated GPU.
    Thanks!
     
  2. odin243

    odin243 Notebook Prophet

    Reputations:
    862
    Messages:
    6,223
    Likes Received:
    0
    Trophy Points:
    205
    The X3100 will probably become a very powerful card, as integrated cards go. It will approach and probably equal entry-level solutions like the x300 or the Go7200 (BTW, the x1400 is a mid-level card, and the NVS140m is practically a performance card, not entry-level).

    For video-editing, an x3100 won't be any worse than an NVS140m.

    For battery life, I believe the x3100 adds something like 10% at idle over the NVS140m, though I'm not 100% sure on that number; I'd have to check again to be certain.
     
  3. JM

    JM Mr. Misanthrope NBR Reviewer

    Reputations:
    4,370
    Messages:
    2,182
    Likes Received:
    8
    Trophy Points:
    56
    The X3100 does have a lot of potential. It's not going to be as great as most cards, but like Odin said, it will be up there with the older entry-level dedicated cards.

    We just need the drivers to unlock its power.

    The power savings are the major benefit of going integrated. While it's not a huge boost, your system will run cooler and have somewhat more battery life overall, compared to a system with dedicated graphics.
     
  4. braddd

    braddd Notebook Deity

    Reputations:
    44
    Messages:
    834
    Likes Received:
    0
    Trophy Points:
    30
    I wouldn't be surprised if the x3100 added that much battery life. I have the 140M and I find it a struggle to get even 3 hours on a 6-cell.
     
  5. clyde1

    clyde1 Notebook Consultant

    Reputations:
    2
    Messages:
    138
    Likes Received:
    0
    Trophy Points:
    30
    Ok, thanks. I guess it depends on the need, but in general, if a dedicated graphics card only reduces the battery life by about 10%, then the only thing that would stop me from getting a decent card would be the price (unless you're absolutely certain you won't be doing anything the X3100 couldn't handle).

    What is it that makes the integrated solution inherently more efficient? I assume the integrated solution takes advantage of capabilities already on the motherboard / processor that the dedicated card has to duplicate? Plus the dedicated card's extra RAM draws power too?

    I need to find the thread where I thought someone with an R61 14" with the X3100 could only get about 3 hours. I think my T60 15.4" with the ATI X1400 could get to 3 hours (had about 25% left at 2.5 hours, moderate surfing, wifi on).
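    (For reference, that estimate is just a linear extrapolation from the 25%-left reading. A quick sketch of the arithmetic, with a made-up helper name, and keeping in mind that real battery drain isn't linear:)

    Code:
    #include <iostream>

    // Linear extrapolation of total runtime from one observation:
    // hours elapsed so far and the battery percentage still remaining.
    // (Hypothetical helper; drain curves aren't linear in practice,
    // so treat the result as a ballpark figure only.)
    double estimate_runtime_hours(double elapsed_hours, double percent_left) {
        double percent_used = 100.0 - percent_left;
        return elapsed_hours * (100.0 / percent_used);
    }

    int main() {
        // 25% left after 2.5 hours of moderate surfing with wifi on
        std::cout << estimate_runtime_hours(2.5, 25.0) << " hours\n"; // ~3.33
        return 0;
    }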
     
  6. odin243

    odin243 Notebook Prophet

    Reputations:
    862
    Messages:
    6,223
    Likes Received:
    0
    Trophy Points:
    205
    Those results could very well be with different battery capacities as well.

    Essentially, yes. Also because a dedicated graphics card is inherently more powerful, even at idle, which increases battery drain.
     
  7. JCMS

    JCMS Notebook Prophet

    Reputations:
    455
    Messages:
    4,674
    Likes Received:
    0
    Trophy Points:
    105
    According to the specs, it *should* be able to run the newest games for some time. There are 10 USPUs after all, and the 6600GT had 8 pixel and 3 vertex units, so it's not that far off spec-wise (although I doubt it will match the horsepower).
     
  8. epictrance4life

    epictrance4life Notebook Geek

    Reputations:
    107
    Messages:
    75
    Likes Received:
    0
    Trophy Points:
    15
    it's actually 8 USPUs on the X3000/X3100 :)
     
  9. clyde1

    clyde1 Notebook Consultant

    Reputations:
    2
    Messages:
    138
    Likes Received:
    0
    Trophy Points:
    30
    But I thought (perhaps incorrectly) that the ATI X1400 or Nvidia Quadro NVS 140M with 128MB couldn't run the newest games very well (especially the ATI X1400). Does this mean the X3100 could turn out to be better than, say, the ATI X1400 and on par with the Nvidia 140M? Or am I reading too much into this? I haven't even looked up what 8 USPUs means yet.
     
  10. zenpharaohs

    zenpharaohs Notebook Evangelist

    Reputations:
    15
    Messages:
    353
    Likes Received:
    0
    Trophy Points:
    30
    I didn't know the NVS140M was supposed to be that fast. I didn't know the X3100 was that fast either. Maybe I should have gone with the X3100.

    Maybe there is a way to throttle the NVS140M back for power consumption, and up for performance? That would be the best of both worlds.
     
  11. jaxx1

    jaxx1 Notebook Geek

    Reputations:
    12
    Messages:
    84
    Likes Received:
    0
    Trophy Points:
    15
    While Intel has brought additional developers in-house to focus on graphics, you are at least a year out before the results of that become obvious. That said, most notebook companies won't release updated drivers after a year, so you'll likely revert to driver-modding, and once again, there's no guarantee you'll see the improved performance.

    The X3100 has served its purpose: introducing Intel's transition from a lost cause to a possible contender.

    Next-gen Intel GPUs might be worth keeping an eye on, but then again, so will those of Nvidia and ATI. DirectX 10.1 is just around the corner, after all.
     
  12. odin243

    odin243 Notebook Prophet

    Reputations:
    862
    Messages:
    6,223
    Likes Received:
    0
    Trophy Points:
    205
    The NVS140m is significantly behind current performance cards (8600M, HD2600), but it's about on par with last generation performance cards (x1600, Go7600). It has a good amount of power for moderate gaming.

    And throttling up and down is done automatically in Vista.
     
  13. noxxle99

    noxxle99 Notebook Deity

    Reputations:
    34
    Messages:
    922
    Likes Received:
    1
    Trophy Points:
    31
    I predict that the x3100 will forever hover around 1000 in 3DMark05. I suspect that the drivers Intel is working on are mainly to enhance compatibility with things such as pixel and vertex shaders. This doesn't necessarily mean better frame rates. In fact, if a game gave users the option to select between pixel shader 2.0 and 3.0, and that game worked with the x3100, it would probably run MUCH, MUCH slower with 3.0 shaders enabled. Look at the low-end GeForce 8 cards: they support DX10 features but simply don't have the horsepower to utilize them.

    It wouldn't surprise me one bit if newer drivers for the x3100 gradually reduce frame rate performance in games. It's simply a trade-off for better compatibility with the new DX10 features. And it makes sense for Intel to focus on compatibility, because most people don't buy this card for gaming.
     
  14. Enki

    Enki Notebook Geek

    Reputations:
    2
    Messages:
    96
    Likes Received:
    0
    Trophy Points:
    15
    The 140 will always be faster than the x3100.

    x3100:
    core clock: 500 MHz
    shared memory
    8 unified shaders

    140 (8400M GT-based):
    core clock: 450 MHz
    dedicated memory
    16 stream processors

    Intel does much better at power usage because it's much more integrated (a lower number of chips that need power decreases power usage, hence why system-on-a-chip designs are great for power usage), but also because that is what they concentrate on. ATI and nVidia are performance-first in their designs, while Intel has much more experience making low-power chips.

    Jaxx1, there is something else to consider. When writing the drivers, they may find a bottleneck in the graphics chipset that they can't do anything about, but that is an easy fix for the next version of the chipset.
     
  15. noxxle99

    noxxle99 Notebook Deity

    Reputations:
    34
    Messages:
    922
    Likes Received:
    1
    Trophy Points:
    31
    Someone respond to my post. I have no basis for what I said and I want to know if it is indeed the truth. :p
     
  16. vostro1400user

    vostro1400user Notebook Deity

    Reputations:
    202
    Messages:
    1,064
    Likes Received:
    0
    Trophy Points:
    55
    The X3100 yields less heat, therefore it's more reliable.
     
  17. Enki

    Enki Notebook Geek

    Reputations:
    2
    Messages:
    96
    Likes Received:
    0
    Trophy Points:
    15
    heat!=reliability
    Where do you get that?
     
  18. noxxle99

    noxxle99 Notebook Deity

    Reputations:
    34
    Messages:
    922
    Likes Received:
    1
    Trophy Points:
    31
    LOL LOL LOL
     
  19. vostro1400user

    vostro1400user Notebook Deity

    Reputations:
    202
    Messages:
    1,064
    Likes Received:
    0
    Trophy Points:
    55
    less heat=more reliable
     
  20. odin243

    odin243 Notebook Prophet

    Reputations:
    862
    Messages:
    6,223
    Likes Received:
    0
    Trophy Points:
    205
    I'm pretty sure vostro1400user means that since the x3100 gives off less heat, the notebook's internals will be put under less stress and therefore last longer. This is a nice idea, but not necessarily true.
     
  21. noxxle99

    noxxle99 Notebook Deity

    Reputations:
    34
    Messages:
    922
    Likes Received:
    1
    Trophy Points:
    31
    He is using the term "reliable" ambiguously, then. Some hardware is built to sustain higher temperature levels over time. His general statement is not true, as you stated.
     
  22. vostro1400user

    vostro1400user Notebook Deity

    Reputations:
    202
    Messages:
    1,064
    Likes Received:
    0
    Trophy Points:
    55
    Well, you could argue that chips could be built to sustain higher temperatures, but the capacitors on the board are not.
     
  23. Enki

    Enki Notebook Geek

    Reputations:
    2
    Messages:
    96
    Likes Received:
    0
    Trophy Points:
    15
    How many laptops have you seen go bad from the capacitors overheating, in comparison to other things going bad? (Not including those using the bad capacitors from a few years ago.) Plus, how long will that take? If it takes 8 years for the failure to happen, do you really care? The original HDD will be worn out by then and the laptop will be very outdated....

    Laptops do have cooling fans, too.
     
  24. vostro1400user

    vostro1400user Notebook Deity

    Reputations:
    202
    Messages:
    1,064
    Likes Received:
    0
    Trophy Points:
    55
    You might not be able to identify which capacitor is causing trouble; instead, you'd just get blue screens, frozen programs...
     
  25. odin243

    odin243 Notebook Prophet

    Reputations:
    862
    Messages:
    6,223
    Likes Received:
    0
    Trophy Points:
    205
    It's highly unlikely. For the most part, the components in such a system are thermally rated for the highest temperatures they might sustain (with high-end processors and dedicated GPUs, for example). If a capacitor goes bad, it's almost certainly because of a bad cooling system (malfunctioning fans, sloppy thermal paste, badly seated heatsinks) or because there's a flaw in the capacitor itself. These risks aren't really lessened to any significant degree by an integrated GPU.
     
  26. vostro1400user

    vostro1400user Notebook Deity

    Reputations:
    202
    Messages:
    1,064
    Likes Received:
    0
    Trophy Points:
    55
    No matter how reliable a high-temperature system claims to be, I prefer the cooler one.
     
  27. M189

    M189 Newbie

    Reputations:
    0
    Messages:
    3
    Likes Received:
    0
    Trophy Points:
    5
    I have the integrated graphics card on my T61, the T7300 processor, and 1GB of RAM. Since the integrated graphics uses system memory, is it worth upgrading the RAM to get better graphics and gaming performance? Has anyone done this upgrade and seen significant positive results?
     
  28. wax4213

    wax4213 Notebook Consultant

    Reputations:
    61
    Messages:
    231
    Likes Received:
    0
    Trophy Points:
    30
    Upgrading RAM is never a bad thing, and depending on your OS (Vista definitely; XP runs fine on 1GB), you'll see a decent performance increase going from 1GB to however much you decide. I chose to buy a 2GB stick of RAM for my D630 and replaced one of the two 512MB sticks that came with it, for a total of 2.5GB. It was easily worth the $98 + shipping for the RAM, and there's an even better deal in the RAM deals thread from Fry's: $95 shipped for a 2GB stick. If you've got the money, then I would go for it.

    Even though XP runs very well on 1GB, more RAM can be used by your other programs, especially some games. It's still worth it to upgrade.
     
  29. Enki

    Enki Notebook Geek

    Reputations:
    2
    Messages:
    96
    Likes Received:
    0
    Trophy Points:
    15
    While I do agree with wax4213, there is something I should point out. The RAM upgrade is an indirect speed increase, meaning the laptop will be faster with more RAM, but not because the extra RAM will affect the integrated graphics in a direct way (i.e., it doesn't increase the memory bus speed).
     
  30. odin243

    odin243 Notebook Prophet

    Reputations:
    862
    Messages:
    6,223
    Likes Received:
    0
    Trophy Points:
    205
    More RAM should affect the integrated graphics directly, since the more RAM you have, the more it can allocate to the IGP and still have enough system RAM to run your apps quickly. It won't increase the power of the graphics chip, but it should increase overall graphics speed.
     
  31. Enki

    Enki Notebook Geek

    Reputations:
    2
    Messages:
    96
    Likes Received:
    0
    Trophy Points:
    15
    No, graphics memory is not like normal computer memory; there is no caching or paging that makes larger amounts of RAM faster. So either all your frame buffers and textures fit in memory and everything works, or they don't and it doesn't work.

    (Yes, there is AGP texturing and the like with PCI-E, but that doesn't really impact integrated graphics.)
     
  32. odin243

    odin243 Notebook Prophet

    Reputations:
    862
    Messages:
    6,223
    Likes Received:
    0
    Trophy Points:
    205
    Well, at least with dedicated chips it's been shown that larger amounts of system RAM allocated to the graphics card do indeed increase the speed of the graphics system. I'm fairly certain that the same holds true for IGPs (though to a smaller extent, as IGPs are less capable of taking advantage of large amounts of VRAM).
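    If anyone wants to see what pool the driver is actually reporting on their own machine, Direct3D 9's GetAvailableTextureMem() gives a rough number. A minimal sketch (untested here; error handling mostly omitted, and the value is only the driver's estimate, typically local plus shared memory rounded to the nearest MB):

    Code:
    #include <d3d9.h>
    #include <iostream>
    #pragma comment(lib, "d3d9.lib")

    int main() {
        IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
        if (!d3d) return 1;

        // Bare-minimum windowed device, just enough to query the driver.
        D3DPRESENT_PARAMETERS pp = {};
        pp.Windowed = TRUE;
        pp.SwapEffect = D3DSWAPEFFECT_DISCARD;

        IDirect3DDevice9* dev = NULL;
        if (SUCCEEDED(d3d->CreateDevice(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL,
                                        GetDesktopWindow(),
                                        D3DCREATE_SOFTWARE_VERTEXPROCESSING,
                                        &pp, &dev))) {
            // Estimate of texture memory the card/IGP can use right now.
            UINT mb = dev->GetAvailableTextureMem() / (1024 * 1024);
            std::cout << "Available texture memory: " << mb << " MB\n";
            dev->Release();
        }
        d3d->Release();
        return 0;
    }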
     
  33. noxxle99

    noxxle99 Notebook Deity

    Reputations:
    34
    Messages:
    922
    Likes Received:
    1
    Trophy Points:
    31
    For some reason I cannot install the video driver from Intel's website for my X61. I can only use the one available at Lenovo. Anybody know why?
     
  34. Enki

    Enki Notebook Geek

    Reputations:
    2
    Messages:
    96
    Likes Received:
    0
    Trophy Points:
    15
    Only in games with very large textures has a larger-memory graphics card done better than the same card with less memory, because the one with less memory will then use AGP-style texturing and have to use main memory. (With integrated graphics it's all the same memory pool, so obviously no AGP-style texturing slowdown is possible.)

    You can't allocate memory to the graphics card for any other purpose.
     
  35. odin243

    odin243 Notebook Prophet

    Reputations:
    862
    Messages:
    6,223
    Likes Received:
    0
    Trophy Points:
    205
    I'm not talking about larger amounts of dedicated memory; if you'll read my post, you'll find that I was talking about "larger amounts of system RAM allocated to the graphics card", i.e. shared memory.
     
  36. Enki

    Enki Notebook Geek

    Reputations:
    2
    Messages:
    96
    Likes Received:
    0
    Trophy Points:
    15
    So if this memory is supposed to make the graphics faster, what is the memory being used for?
     
  37. noxxle99

    noxxle99 Notebook Deity

    Reputations:
    34
    Messages:
    922
    Likes Received:
    1
    Trophy Points:
    31
    The x3100 will forever suck. I just tried my first game (Civ4) on my X61 and I'm getting about the SAME fps I got with my Intel 915GM. About 15 fps average, regardless of what the graphics are set to...
     
  38. epictrance4life

    epictrance4life Notebook Geek

    Reputations:
    107
    Messages:
    75
    Likes Received:
    0
    Trophy Points:
    15
    Forever is a long time... I bet you're running Vista. The x3100 Vista drivers are severely crippled compared with the x3100 XP drivers, as XP has h/w T&L and vertex shader support while Vista's still does not. I've emailed Intel support about this and they said:
    We do not have an ETA on a future driver for Windows Vista* that will actually support transform and lighting. Currently, this is being developed.
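    If you want to verify what your installed driver actually exposes to games, a minimal Direct3D 9 caps check (a sketch, untested here, with error handling omitted) would look something like this; run it under the XP and Vista drivers and compare:

    Code:
    #include <d3d9.h>
    #include <iostream>
    #pragma comment(lib, "d3d9.lib")

    int main() {
        IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
        if (!d3d) return 1;

        // Ask the driver what the HAL device supports.
        D3DCAPS9 caps;
        d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps);

        bool hwTnL = (caps.DevCaps & D3DDEVCAPS_HWTRANSFORMANDLIGHT) != 0;
        std::cout << "Hardware T&L:   " << (hwTnL ? "yes" : "no") << "\n";
        std::cout << "Vertex shaders: "
                  << D3DSHADER_VERSION_MAJOR(caps.VertexShaderVersion) << "."
                  << D3DSHADER_VERSION_MINOR(caps.VertexShaderVersion) << "\n";
        std::cout << "Pixel shaders:  "
                  << D3DSHADER_VERSION_MAJOR(caps.PixelShaderVersion) << "."
                  << D3DSHADER_VERSION_MINOR(caps.PixelShaderVersion) << "\n";

        d3d->Release();
        return 0;
    }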
     
  39. noxxle99

    noxxle99 Notebook Deity

    Reputations:
    34
    Messages:
    922
    Likes Received:
    1
    Trophy Points:
    31
    I've tried it in both Vista AND XP. Same results. I get the same 3DMark score in both operating environments too.

    How can Vista's drivers not support T&L and vertex shaders? I've just tried several games under Vista and I am getting the same frame rates as when I was using XP....

    *edit*
    The Vista 15.6 driver is supposed to include support for those features, right? It was released Sept 2.
     
  40. epictrance4life

    epictrance4life Notebook Geek

    Reputations:
    107
    Messages:
    75
    Likes Received:
    0
    Trophy Points:
    15
    they were supposed to include those features, but Intel seems to be having problems adding them; see my thread here regarding the issue