Do any of the docks for the X60 or X60s have DVI out? Also, are there any ThinkPads with DVI built in?
-
I don't think IBMs have DVI output. I have the X6 UltraBase and it only has VGA. The only laptop company I can think of at the moment that has DVI built in is Apple's MacBook series.
-
I have an R60 and have been trying to figure out if I can get DVI output too... I called IBM and they said that even if I had the dock with the DVI output, it wouldn't work. This could be due to my X1300 GPU, but in any event they said the port would not work... You might want to talk to them, because it could be different for the X60.
The only other option I've come across to get DVI output is a video signal converter, which is pretty expensive. -
The X60(s) docks don't have a DVI connector. You could get a DVI PC Card, but it costs more than $200.
-
That's too bad; even Dells have one!
DVI is very useful: you can use the 12.1" screen on the go and a 20"+ HD LCD at home or at the office.
The Z series seems to have a DVI connection, though I'm not sure how to configure it. Here is the description from Lenovo's main Z-series page:
DVI connector pass-through support (select models) -
You don't need a DVI connector to use a 20" LCD. A VGA hookup will work perfectly fine.
-
So what exactly is the difference between DVI and VGA? Is one digital and the other analog?
-
Many manufacturers skimp on the VGA circuitry when a monitor has both connectors, which could explain that, though.
Where have you read that it introduces extra latency?
You'd probably not game on your X60 anyway, but that extra latency is nothing compared to the delay in your eyes or in the rest of the monitor. It shouldn't be, anyway.
And 1680x1050 is not a high resolution. I use it on my 20" widescreen LCD (with an X60s) without any problems.
Even 1920x1080 (HD resolution) is near the upper end of what single-link DVI was originally meant to handle. Dual-link DVI, however, might give VGA a match. But monitors requiring dual-link don't even have VGA, and the cheapest ones that need it are the Apple/Dell 30" monitors (if I haven't missed anything)...
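To put rough numbers on the single-link vs. dual-link point, here's a back-of-the-envelope sketch. The 165 MHz single-link pixel clock is the standard DVI figure; the ~25% blanking allowance and the specific 60 Hz modes are my own assumptions, so treat the output as estimates rather than exact spec limits.

```python
# Rough check of which resolutions fit within single-link DVI's pixel clock.
# Single-link DVI tops out at 165 MHz; dual-link doubles the TMDS capacity.
# The 1.25 factor is an approximate allowance for blanking intervals
# (reduced-blanking modes need less, so borderline cases may still fit).

SINGLE_LINK_HZ = 165e6
DUAL_LINK_HZ = 2 * SINGLE_LINK_HZ
BLANKING_OVERHEAD = 1.25

def required_pixel_clock(width, height, refresh_hz=60):
    """Estimate the pixel clock (Hz) a mode needs, including blanking."""
    return width * height * refresh_hz * BLANKING_OVERHEAD

for w, h in [(1680, 1050), (1920, 1080), (2048, 1536), (2560, 1600)]:
    clock = required_pixel_clock(w, h)
    if clock <= SINGLE_LINK_HZ:
        link = "fits single-link DVI"
    elif clock <= DUAL_LINK_HZ:
        link = "needs dual-link DVI"
    else:
        link = "beyond dual-link DVI"
    print(f"{w}x{h}@60: ~{clock / 1e6:.0f} MHz -> {link}")
```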
-
The size of the screen doesn't matter (other than that bigger screens more often have higher resolutions).
And it shouldn't reduce performance either. The D-to-A conversion is done by a separate circuit and has nothing to do with the graphics core itself.
Yes, LCDs have ghosting, but that has nothing to do with VGA.
And the ghosting is way more prominent than the D-to-A conversion.
LCD ghosting is measured in milliseconds; the D-to-A conversion shouldn't take many microseconds, though I haven't got any numbers on that.
(But the fact is that CRT monitors using VGA are much faster than LCDs using DVI.)
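To give the ghosting-vs-conversion comparison some ballpark figures: the DAC works at the pixel clock, so its share is nanoseconds per pixel, while a whole frame and LCD pixel response are both in the millisecond range. The specific CRT mode and the 12 ms response figure below are assumptions for illustration, not measurements.

```python
# Orders of magnitude for the "VGA adds latency" argument.
# The RAMDAC converts pixels at the pixel-clock rate, so its contribution is
# nanoseconds; LCD pixel response (ghosting) and frame times are milliseconds.

pixel_clock_hz = 229.5e6        # e.g. 1600x1200@85 on a CRT (assumed mode)
dac_time_per_pixel = 1 / pixel_clock_hz
frame_time_60hz = 1 / 60        # one frame takes this long on any interface
lcd_response = 12e-3            # typical quoted response time, roughly 8-25 ms

print(f"DAC time per pixel: {dac_time_per_pixel * 1e9:.1f} ns")
print(f"One 60 Hz frame:    {frame_time_60hz * 1e3:.1f} ms")
print(f"LCD response time:  {lcd_response * 1e3:.1f} ms")
```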
The performance difference you are talking about is probably because you more often use a higher resolution on your external monitor (1024x768 is a very low resolution) and/or because you are running a dual-screen setup (laptop panel + desktop monitor).
Most common today, I think, are ~17" CRTs that do at least 1280x1024 (although people seldom realize that 1280x1024 doesn't have the right aspect ratio for a CRT).
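A quick check of that aspect-ratio remark; nothing is assumed here beyond the standard 4:3 and 5:4 ratios.

```python
# 1280x1024 is a 5:4 mode, while CRT tubes are 4:3, so the image is slightly
# stretched unless the geometry is adjusted; the matching 4:3 mode is 1280x960.
print(f"1280x1024 aspect ratio: {1280 / 1024:.3f} (5:4)")
print(f"CRT tube aspect ratio:  {4 / 3:.3f} (4:3)")
print(f"4:3 mode at 1280 wide:  1280x{1280 * 3 // 4}")
```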
But still, what the displays handle isn't relevant. VGA supports higher resolutions than single-link DVI and has been around for much longer.
VGA EASILY handles 1680x1050 in great quality.
The only LCD with both VGA and DVI that I have is the Dell 2007WFP, and at 1680x1050 I can't see any difference between my ATI X850XT desktop card using DVI and my X60s using VGA.
And I used to have a 20" CRT and a 22" CRT, the latter supporting 2048x1536 (a resolution not even supported by single-link DVI). Although that was pushing the envelope for what my monitor could manage, there are better monitors out there.
This FW900 for instance:
It supports a massive resolution of 2304x1440.
http://www.hardforum.com/showthread.php?t=952788
And that's through VGA... although probably with a quality cable.
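Here's why a mode like that can work over VGA but not single-link DVI: the analog side is limited by the RAMDAC clock rather than the 165 MHz TMDS link. The 400 MHz RAMDAC figure and the 80 Hz refresh are assumptions typical of that hardware, so this is only a rough sketch.

```python
# Rough comparison of the FW900's top mode against analog and DVI limits.
# Uses the same ~25% blanking allowance as the earlier sketch.

width, height, refresh = 2304, 1440, 80   # FW900 mode cited above (80 Hz assumed)
needed_hz = width * height * refresh * 1.25

print(f"Needed pixel clock:             ~{needed_hz / 1e6:.0f} MHz")
print(f"Fits a 400 MHz RAMDAC (VGA):    {needed_hz <= 400e6}")
print(f"Fits single-link DVI (165 MHz): {needed_hz <= 165e6}")
```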
Googled a bit and found this:
http://www.gamepc.com/labs/view_content.asp?id=samsung910t&page=3
The monitor in that review is old, and I hope that explains the large differences, because a good monitor today (or yesterday) certainly shouldn't show such large differences; even a bad monitor shouldn't, but unfortunately that's not the case.
So no, the difference between VGA and DVI is very small, and the largest differences depend on what components you have.
So in theory you shouldn't even need DVI on the X60, but as reality shows, depending on what components you have, DVI might give a better image.
But that doesn't have anything to do with VGA (or DVI) itself, but rather with greedy manufacturers (or, in the old days, bad hardware; that's not an excuse today). -
D-to-A conversion is done by the video card. The only case where it's on a separate circuit is when the video card is integrated.
I started this thread to find out if it's possible to get DVI out from ThinkPads, not to discuss the difference between VGA and DVI, but thank you for sharing your experiences. -
As I said, some LCDs do show a difference between VGA and DVI, and that's because of poor components in the monitor.
CRT might be an outdated technology, but it still outperforms LCDs any day. And that's beside the point. I talked about CRTs to really show why VGA is not an issue...
But I suppose that didn't come through.
Also, if you can find any benchmarks showing that VGA gives lower results than DVI, I'd be interested. Since you state it as fact, that shouldn't take more than a couple of seconds.
(EDIT: maybe it depends on what you count as a separate circuit; I may have got the translation wrong. But at least it's handled by a separate chip/part that does not affect the performance of the graphics core itself.
Also, you say it's on a separate circuit when the graphics card is integrated. I guess that settles it, since the X60(s) has integrated graphics.)
And I feel sorry for web designers using 1024x768. I design web pages myself (as a hobby), and sure, you certainly make sure your sites fit 1024x768 and even 800x600, but that's a completely different thing.
But sure, everyone has their own needs, and I actually know people who prefer 1152x864 over anything else (even on LCDs running 1280x1024), but they are not the majority.
If you want to talk quality, you might start off with a quality panel, not a TN panel like the one in the Samsung monitor you mentioned; those are optimized for low cost and speed. Dell actually uses, at least in my monitor, S-IPS, which is far superior when it comes to picture quality. Yes, Dell does mix panel types without saying anything, which is pathetic, but you can't say that everything they do is bad (without looking ridiculous).
The build quality has received outstanding reviews, and that also shows in the price tag. It's far more expensive than the Samsung (about twice as much), but you get what you pay for.
It makes sense that Samsung wouldn't spend money on the VGA circuitry of a display that is primarily targeted at gamers and doesn't have 16.7M colors to begin with.
This is getting ridiculous. Do some research...
I'm sorry if I come off a bit harsh; I've had a long day and still didn't get nearly as much done as I should have.