Who has a crystal ball and can predict how good the integrated X3100 GPU solution will eventually become once the drivers are optimized? Will it become, say, 30% better than it is now?
Will it ever approach the performance of an entry level dedicated GPU such as the ATI X1400 or Nvidia Quadro NVS 140M with 128MB?
For normal non-gaming use, is the X3100 a bottleneck in any way with regard to performance (speed) under Vista?
And how much of a benefit to battery life does the X3100 offer compared to the above dedicated GPUs?
I'm trying to configure a system for my father-in-law, who wants a notebook alternative to his old desktop.
I'm thinking about either the R61 14" WXGA or T61 15.4" (WSXGA or WSXGA+ ???), with the integrated X3100.
He won't need gaming or heavy video editing capability, but would probably like the idea that moderate video editing can be done without a problem.
I see a lot of comments about how much better battery life is with an integrated GPU, but some of the battery-life reports I've heard for the R61 with the integrated X3100 didn't sound that great to me.
He'll likely be plugged in most of the time anyway, so battery life is probably not a big consideration for him, but I'm curious about it for myself. Price certainly matters, though, and the X3100 will be cheaper than a dedicated GPU.
Thanks!
-
The X3100 will probably become a very powerful card, as integrated cards go. It will approach and probably equal entry-level solutions like the X300 or the Go 7200 (BTW, the X1400 is a mid-level card, and the NVS 140M is practically a performance card, not entry-level).
For video editing, an X3100 won't be any worse than an NVS 140M.
For battery life, I believe the X3100 adds something like 10% at idle over the NVS 140M, though I'm not 100% sure of that number; I'd have to check again to be certain. -
The X3100 does have a lot of potential. It's not going to be as great as most dedicated cards, but like Odin said, it will be up there with the older entry-level dedicated cards.
We just need the drivers to be able to unlock its power.
The power savings are the major benefit of going integrated. While it's not a huge boost, your system will run cooler and get somewhat more battery life overall compared to a system with dedicated graphics. -
-
Ok, thanks. I guess it depends on the need, but in general, if a dedicated graphics card only reduces battery life by about 10%, then the only thing that would stop me from getting a decent card would be the price (unless you're absolutely certain you won't be doing anything the X3100 couldn't handle).
What is it that makes the integrated solution inherently more efficient? I assume the integrated solution takes advantage of capabilities already on the motherboard/processor that a dedicated card has to duplicate? Plus the extra RAM?
I need to find the thread where someone with an R61 14" with the X3100 could only get about 3 hours. I think my T60 15.4" with the ATI X1400 could get to 3 hours (about 25% left at 2.5 hours; moderate surfing, wifi on).
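For the curious, here's the back-of-the-envelope math behind that estimate, as a rough Python sketch. The 2.5-hour/25%-remaining figures are just my own anecdotal readings above, and the ~10% idle saving is Odin's estimate, not a measured spec:

    # Rough battery-runtime extrapolation from a partial discharge.
    # All figures are anecdotal forum observations, not measurements.
    hours_elapsed = 2.5        # T60 w/ X1400: moderate surfing, wifi on
    fraction_remaining = 0.25  # battery gauge at the 2.5-hour mark

    estimated_runtime = hours_elapsed / (1 - fraction_remaining)
    print(f"T60 (X1400) estimated runtime: {estimated_runtime:.1f} h")  # ~3.3 h

    # If integrated graphics really saves ~10% at idle (Odin's rough figure),
    # an otherwise identical X3100 machine might manage about:
    idle_savings = 0.10
    print(f"With integrated GPU: ~{estimated_runtime * (1 + idle_savings):.1f} h")  # ~3.7 h

So the ~10% figure would buy roughly 20 extra minutes over my T60, which would line up with those "not that great" R61 reports. -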
-
According to the specs, it *should* be able to run the newest games for some time. There are 8 unified shader units (USPUs) after all, and the 6600GT had 8 pixel and 3 vertex shaders, so it's not that far off spec-wise (although I doubt it will match the horsepower).
-
But I thought (perhaps incorrectly) that the ATI X1400 or Nvidia Quadro NVS 140M with 128MB couldn't run the newest games very well (especially the ATI X1400). Does this mean the X3100 could turn out to be better than, say, the ATI X1400 and on par with the Nvidia 140M? Or am I reading too much into this? I haven't looked up what 8 USPU even means yet.
-
Maybe there is a way to throttle the NVS140M back for power consumption, and up for performance? That would be the best of both worlds. -
While Intel has brought additional developers in-house to focus on graphics, you're at least a year out from seeing obvious results. That said, most notebook companies won't release updated drivers after a year, so you'll likely resort to driver-modding, and even then there's no guarantee you'll see the improved performance.
The X3100 has served its purpose: marking Intel's transition from a lost cause to a possible contender.
Next-gen Intel GPUs might be worth keeping an eye on, but then again, so will those of Nvidia and ATI. DirectX 10.1 is just around the corner, after all. -
And throttling up and down is done automatically in Vista. -
I predict that the x3100 will forever hover around 1000 in 3DMark05. I suspect the drivers Intel is working on are mainly meant to improve compatibility with features such as pixel and vertex shaders. That doesn't necessarily mean better frame rates. In fact, if a game gave users the option to select between pixel shaders 2.0 and 3.0, and it worked with the x3100, it would probably run MUCH MUCH slower with 3.0 shaders enabled. Look at the low-end GeForce 8 cards: they support DX10 features but simply don't have the horsepower to utilize them.
It wouldn't surprise me one bit if newer drivers for the x3100 gradually reduce frame-rate performance in games. It's simply a trade-off for better compatibility with the new DX10 features. And it makes sense for Intel to focus on compatibility, because most people don't buy this chip for gaming. -
The 140 will always be faster than the x3100.
x3100:
core clock: 500 MHz
shared memory
8 unified shaders
140 (8400M-GT based):
core clock: 450 MHz
dedicated memory
16 stream processors
Intel does much better on power usage because its solution is much more integrated (a lower number of chips needing power decreases power draw, hence why system-on-a-chip designs are great for power efficiency), but also because that is what Intel concentrates on. ATI and nVidia design for performance first, while Intel has much more experience making low-power chips.
Jaxx1, there is something else to consider: when writing the drivers, they may find a bottleneck in the graphics chipset that they can't do anything about, but that is an easy fix for the next version of the chipset. -
Someone respond to my post. I have no basis for what I said and I want to know if it is indeed the truth.
-
The X3100 yields less heat and is therefore more reliable.
-
heat != reliability
Where do you get that? -
-
less heat = more reliable
-
I'm pretty sure vosro1400 means that since the x3100 gives off less heat, the notebook's internals will be put under less stress and therefore last longer. This is a nice idea, but not necessarily true.
-
Well, you could argue that the chips can be built to sustain higher temperatures, but the capacitors on the board are not.
-
How many laptops have you seen go bad from capacitors overheating, compared to other things going bad? (Not counting those using the bad capacitors from a few years ago.) Plus, how long would that take? If it takes 8 years for the failure to happen, do you really care? The original HDD will be worn out by then, and the laptop will be very outdated....
Laptops do have cooling fans too. -
You might not be able to identify which capacitor is the trouble; instead, you'd just get blue screens and frozen programs...
-
It's highly unlikely. For the most part, the components in such a system are thermally rated for the highest temperatures they might sustain (with a high-end processor and dedicated GPU, for example). If a capacitor goes bad, it's almost certainly because of a bad cooling system (malfunctioning fans, sloppy thermal paste, badly seated heatsinks) or because there's a flaw in the capacitor itself. These risks aren't really lessened to any significant degree by an integrated GPU.
-
No matter how reliable a hot-running system claims to be, I prefer the cooler one.
-
I have the integrated graphics on my T61, the T7300 processor, and 1GB of RAM. Since the integrated graphics uses system memory, is it worth upgrading the RAM to get better graphics and gaming performance? Has anyone done this upgrade and seen significant positive results?
-
Upgrading RAM is never a bad thing, and depending on your OS (Vista definitely; XP runs fine on 1GB), you'll see a decent performance increase going from 1GB to however much you decide on. I chose to buy a 2GB stick of RAM for my D630 and replaced one of the two 512MB sticks that came with it, for a total of 2.5GB. It was easily worth the $98 + shipping, and there's an even better deal in the RAM deals thread from Fry's: $95 shipped for a 2GB stick. If you've got the money, I would go for it.
Even though XP runs very well on 1GB, more RAM is used for your other programs, especially some games. It's still worth it to upgrade. -
While I do agree with wax4213, there is something I should point out. The RAM upgrade is an indirect speed increase, meaning the laptop will be faster with more RAM, but not because the extra RAM affects the integrated graphics in a direct way (i.e., it doesn't increase the memory bus speed).
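If you want to see what the driver is actually claiming before and after the upgrade, one rough way to check on XP/Vista is to ask WMI. Just a sketch; for shared-memory chips like the x3100, AdapterRAM only reflects what's currently reserved out of system RAM, so treat it as a ballpark number:

    # Print what Windows reports for each video controller (XP/Vista).
    # For integrated chips, AdapterRAM shows only the currently reserved
    # share of system RAM, so it's a rough indicator at best.
    import subprocess

    output = subprocess.check_output(
        ["wmic", "path", "Win32_VideoController", "get", "Name,AdapterRAM"]
    )
    print(output.decode(errors="replace"))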
-
(Yes, there is AGP texturing and the like with PCI-E, but that doesn't really impact integrated graphics.) -
-
For some reason I cannot install the video driver from Intel's website on my x61; I can only use the one available from Lenovo. Does anybody know why?
-
You can't allocate memory to the graphics card for any other purpose. -
-
The x3100 will forever suck. I just tried my first game (Civ4) on my x61, and I'm getting about the SAME fps I got with my Intel 915GM: about 15 fps average, regardless of what the graphics are set to...
-
We do not have an ETA on a future driver for Windows Vista* that will actually support transform and lighting. Currently, this is being developed. -
I've tried it in both Vista AND XP. Same results. I get the same 3DMark score in both operating environments, too.
How can Vista's drivers not support T&L and vertex/pixel shaders? I've just tried several games under Vista, and I am getting the same frame rates as when I was using XP....
*edit*
The Vista 15.6 driver is supposed to include support for those features, right? It was released Sept 2. -