The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to preserve the valuable technical information that had been posted on the forums. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.

    DVI from X60 - Possible?

    Discussion in 'Lenovo' started by gevorg, Nov 9, 2006.

  1. gevorg

    gevorg Notebook Enthusiast

    Reputations:
    0
    Messages:
    15
    Likes Received:
    0
    Trophy Points:
    5
    Do any of the docks for the X60 or X60s have DVI out? Also, are there any ThinkPads with DVI built in?
     
  2. Qhs

    Qhs Notebook Evangelist

    Reputations:
    40
    Messages:
    666
    Likes Received:
    0
    Trophy Points:
    30
    I don't think IBMs have a DVI output. I have the X6 UltraBase and it only has VGA. The only laptop company I can think of at the moment that has DVI built in is Apple's MacBook series.
     
  3. bigtrip

    bigtrip Notebook Enthusiast

    Reputations:
    6
    Messages:
    32
    Likes Received:
    0
    Trophy Points:
    15
    I have an R60 and have been trying to figure out if I can get a DVI output also... I called IBM and they said that even if I had the dock that has the DVI output, it wouldn't work. This could be due to my X1300 GPU but in any event they said the port would not work... You might want to talk to them because it could be different for the X60.

    The only other option that I've come across to get a DVI output is a video signal converter, which is pretty expensive.
     
  4. Fred from NYC

    Fred from NYC Notebook Evangelist

    Reputations:
    64
    Messages:
    501
    Likes Received:
    1
    Trophy Points:
    30
    The X60(s) docks don't have a DVI connector. You could get a DVI PC Card, but it costs more than $200.
     
  5. gevorg

    gevorg Notebook Enthusiast

    Reputations:
    0
    Messages:
    15
    Likes Received:
    0
    Trophy Points:
    5
    That's too bad; even Dells have one!

    DVI is very useful: you can use the 12.1" screen on the go and/or a 20"+ HD LCD at home/office.

    The Z series seems to have a DVI connection; not sure how to configure it, though. Here is the description from Lenovo's main Z series page:

    DVI connector pass-through support (select models)
     
  6. ZaZ

    ZaZ Super Model Super Moderator

    Reputations:
    4,982
    Messages:
    34,001
    Likes Received:
    1,415
    Trophy Points:
    581
    You don't need a DVI connector to use a 20" LCD. A VGA hookup will work perfectly fine.
     
  7. Momo26

    Momo26 Notebook Deity NBR Reviewer

    Reputations:
    128
    Messages:
    1,378
    Likes Received:
    0
    Trophy Points:
    55
    So what exactly is the difference between DVI and VGA? One is digital, the other analog?
     
  8. gevorg

    gevorg Notebook Enthusiast

    Reputations:
    0
    Messages:
    15
    Likes Received:
    0
    Trophy Points:
    5
    Yes, VGA will work too, but on big screens the difference in quality is very noticeable, especially if you use a high resolution (text is blurrier, etc.). Plus, the extra D-to-A conversion introduces additional latency, which is definitely noticeable in gaming and possibly noticeable in video playback.

    Yes. DVI also supports much higher resolutions (including HDTV).
     
  9. tjoff

    tjoff Notebook Geek

    Reputations:
    1
    Messages:
    97
    Likes Received:
    0
    Trophy Points:
    15
    The quality difference isn't noticeable if you've got good components.
    Many manufacturers skimp on the VGA connection when a monitor has both, though, which can explain it.

    Where have you read that it introduces extra latency?
    You probably wouldn't game on your X60 anyway, but that extra latency is nothing compared to the delay in your eyes or in the rest of the monitor. Shouldn't be, anyway.


    Actually, VGA supports much higher resolutions than DVI.
    And 1680x1050 is not a high resolution. I use it on my 20" widescreen LCD without any problems (with an X60s).

    Even 1920x1080 (HD resolution) pushes the limit of what single-link DVI was originally meant to handle. Dual-link DVI, however, might give VGA a match. But monitors requiring that don't even have VGA, and the cheapest monitors requiring dual-link DVI are the Apple/Dell 30" monitors (if I haven't missed anything)...
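
    If anyone wants to sanity-check those bandwidth claims, here's a rough Python sketch (the ~20% blanking overhead is just an assumption; real CVT/GTF and reduced-blanking timings vary per mode):

        # Rough pixel-clock estimate vs. the single-link DVI limit (165 MHz TMDS clock).
        # The ~20% blanking overhead is an assumption; actual timings differ per mode,
        # so treat these as ballpark figures only.
        SINGLE_LINK_DVI_MHZ = 165.0
        BLANKING_OVERHEAD = 1.20

        def pixel_clock_mhz(width, height, refresh_hz=60):
            return width * height * refresh_hz * BLANKING_OVERHEAD / 1e6

        for w, h in [(1280, 1024), (1680, 1050), (1920, 1080), (2048, 1536), (2304, 1440)]:
            clk = pixel_clock_mhz(w, h)
            verdict = "fits single-link" if clk <= SINGLE_LINK_DVI_MHZ else "needs dual-link (or VGA)"
            print(f"{w}x{h}@60: ~{clk:.0f} MHz -> {verdict}")

    1920x1080@60 squeaks in under 165 MHz by this estimate, but 2048x1536 and up clearly don't.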
     
  10. gevorg

    gevorg Notebook Enthusiast

    Reputations:
    0
    Messages:
    15
    Likes Received:
    0
    Trophy Points:
    5
    I have good components, tried various brands, and the quality difference is easily noticeable as long as the display is big and the resolution is high.

    Extra latency comes from the additional D-to-A conversion that the video card has to do. Plus, it reduces the performance of the video card, especially if it is integrated (like on the X60). Unlike CRT displays, LCDs are prone to showing "ghosting" in fast-motion video/gaming.

    Most VGA displays do not support resolutions higher than 800x600.

    :) DVI doesn't need dual-link to beat VGA; there is a clear difference even at 1280x1024 on my Samsung 910t.
     
  11. tjoff

    tjoff Notebook Geek

    Reputations:
    1
    Messages:
    97
    Likes Received:
    0
    Trophy Points:
    15
    1680x1050 is not a high resolution (not for VGA at least; DVI isn't that good, but hardware components have gotten better to compensate for DVI's limitations).
    The size of the screen doesn't matter (other than that bigger screens more often have higher resolutions).

    I'd say that the extra latency is insignificant.
    And it shouldn't reduce performance either. The D-to-A conversion is done by a separate circuit and has nothing to do with the graphics card itself.
    Yes, LCDs have ghosting, but it has nothing to do with VGA.
    And the ghosting is way more prominent than the D-to-A conversion.

    LCD ghosting is measured in milliseconds; the D-to-A conversion shouldn't take more than a few microseconds. But I haven't got any numbers on that.
    (But the fact is that CRT monitors using VGA are much faster than LCDs using DVI.)
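
    To put some rough numbers on it (these are typical figures I'm assuming, not measurements):

        # Rough timescale comparison: per-pixel D-to-A conversion vs. frame time
        # vs. LCD response time. Figures are assumed typical values, not measurements.
        RAMDAC_MHZ = 400       # assumed RAMDAC pixel rate of the era
        REFRESH_HZ = 60
        LCD_RESPONSE_MS = 8.0  # assumed typical quoted LCD response time

        ns_per_pixel = 1000.0 / RAMDAC_MHZ   # one D-to-A conversion, in nanoseconds
        frame_ms = 1000.0 / REFRESH_HZ       # one frame at 60 Hz, in milliseconds

        print(f"D-to-A conversion per pixel: ~{ns_per_pixel:.1f} ns")
        print(f"One frame at {REFRESH_HZ} Hz:        ~{frame_ms:.1f} ms")
        print(f"Typical LCD response time:   ~{LCD_RESPONSE_MS:.1f} ms")

    The conversion happens per pixel as the signal streams out, so it's orders of magnitude below anything you could notice next to panel response time.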

    The performance hit you are talking about is probably due to the fact that you more often use a higher resolution on your external monitor (1024x768 is a very low resolution) and/or that you are running a dual-screen setup (laptop monitor + desktop monitor).

    Most VGA displays that can only do 800x600 are on the scrap heap today...
    Most common today, I think, are ~17" CRTs that do at least 1280x1024 (although people seldom realize that 1280x1024 doesn't have the right aspect ratio for a CRT).

    But still, what the displays handle isn't relevant. VGA supports higher resolutions than single-link DVI and has been around for much longer.
    VGA EASILY handles 1680x1050 in great quality.


    As I said, that's probably because Samsung used cheap components on the 910t, since they figure most people will use DVI anyway and spending money on VGA won't benefit them.

    The only LCD with both VGA and DVI that I have is the Dell 2007WFP, and I can't see any difference between my ATI X850XT card on my desktop computer using DVI and my X60s using VGA - at 1680x1050.

    And I used to have one 20" CRT and one 22" CRT, the latter supporting 2048x1536 (a resolution not even supported by single-link DVI) - although that was pushing the envelope for what my monitor can manage there are better monitors out there.

    This FW900, for instance, supports a massive resolution of 2304x1440:
    http://www.hardforum.com/showthread.php?t=952788
    And that's through VGA... although probably with a quality cable.


    Googled a bit and found this:
    http://www.gamepc.com/labs/view_content.asp?id=samsung910t&page=3
    It's apparent that Samsung used really low-quality circuits for the analogue-to-digital conversion.
    The monitor is old, and I hope that can explain the big differences - because a good monitor today (or yesterday) certainly shouldn't show such large differences, and even a bad monitor shouldn't, but unfortunately that's not always the case.

    So no, the difference between VGA and DVI is very small. And the largest differences depend on what components you have.
    So in theory you shouldn't even need DVI on the X60, but as reality shows, depending on what components you have, DVI might give a better image.
    But that doesn't have anything to do with VGA (or DVI) itself, but rather with greedy manufacturers (or, in the old days, with bad hardware - that's not an excuse today).
     
  12. gevorg

    gevorg Notebook Enthusiast

    Reputations:
    0
    Messages:
    15
    Likes Received:
    0
    Trophy Points:
    5
    The bigger the screen/resolution, the easier it is to see a difference between VGA and DVI.

    You might not see it, but some LCD displays do show a difference when switching between VGA and DVI. I don't know and I don't really care about CRT; it's an outdated technology for consumer/business use. (Yes, graphics/medical professionals still use it, but not for long.)

    D-to-A conversion is done by the video card. The only case where it's on a separate circuit is when the video card is integrated.

    1024x768 might be a very low resolution for your uses, but it's actually a standard for many web designers. Most current consumer computers run at 1024x768 or 1280x1024.

    It is relevant, because it's all about what comes out in the end. It doesn't matter how long VGA has been around; it's being phased out now.

    Only in CRT displays. I've used LCD displays from various manufacturers, and they all had worse quality and blurrier text when connected through VGA.

    Or it's a good example that DVI is superior to VGA in consumer displays.

    Dell is the last display brand you'll look at if you want quality. They are the ones who use the cheapest components possible to meet the specs. They probably used cheap DVI components too, which is why you see no difference.

    The difference between VGA and DVI depends on the display you're using. On quality digital displays (not analog), DVI is superior, especially if you're using a big screen (for serious DVI use, there are always HD scalers). In my use, and on many displays I've used, DVI is usually superior (clearer text, etc.).

    I started this thread to find out if it's possible to get DVI out of a ThinkPad, not to discuss the difference between VGA and DVI, but thank you for sharing your experiences.
     
  13. tjoff

    tjoff Notebook Geek

    Reputations:
    1
    Messages:
    97
    Likes Received:
    0
    Trophy Points:
    15
    Why not answer what I wrote?
    As I said, some LCDs do show a difference between VGA and DVI - and that's because of poor components in the monitor.

    CRT might be an outdated technology, but it still outperforms LCDs any day. And that's beside the point. I talked about CRTs to show why VGA is not the issue...
    But I suppose that didn't get through.

    Read up on RAMDAC and come back to me.
    Also, if you can find any benchmarks showing that using VGA results in lower performance than DVI, I'll be interested. Since you state it as fact, that shouldn't take more than a couple of seconds.

    (EDIT: maybe it depends on what you count as a separate circuit; I may have got the translation wrong. But at least it's handled by a separate chip/part that does not affect the performance of the graphics card itself.
    Also, you say that it's on a separate circuit when the graphics card is integrated - guess that solves it, since the X60(s) has integrated graphics.)
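
    For reference, here's roughly what pixel clock a given VGA mode asks of the RAMDAC (the ~20% blanking overhead and the 400 MHz figure are assumptions; 350-400 MHz was a typical RAMDAC spec of the time):

        # Approximate pixel clock a VGA mode demands from the RAMDAC.
        # ~20% blanking overhead assumed; treat the output as ballpark figures.
        TYPICAL_RAMDAC_MHZ = 400.0

        def required_mhz(width, height, refresh_hz):
            return width * height * refresh_hz * 1.20 / 1e6

        for w, h, hz in [(1680, 1050, 60), (2048, 1536, 75), (2304, 1440, 80)]:
            clk = required_mhz(w, h, hz)
            verdict = "within spec" if clk <= TYPICAL_RAMDAC_MHZ else "beyond a 400 MHz RAMDAC"
            print(f"{w}x{h}@{hz}: ~{clk:.0f} MHz -> {verdict}")

    Even 2304x1440@80 lands around 320 MHz by this estimate, which is why high-end CRTs could run such modes over VGA at all.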

    Depends on where you live. Where I live, practically no one uses 1024x768 anymore (and hasn't for years).
    And I feel sorry for web designers using 1024x768. I design web pages myself (as a hobby), and sure, you certainly make sure your sites fit 1024x768 and even 800x600, but that's a completely different thing.

    But sure, everyone has their own needs, and I actually know people who prefer 1152x864 over anything else (even on 1280x1024 LCDs), but they are not the majority.

    You got one thing right (and as I already stated): the biggest con of VGA is that some LCD monitors have bad components. The fault isn't in VGA itself. If you use quality components, though, you won't see any difference at reasonable resolutions.

    You always get the cheapest you can get?

    Umm.. No.

    You're making me laugh.
    If you want to talk quality, you might start off with a quality panel - not a TN panel like in the Samsung monitor you mentioned; those are optimized for low cost and speed. Dell actually uses, at least in my monitor, S-IPS, which is far superior when it comes to picture quality - yes, Dell does mix panel types without saying anything, which is pathetic, but you can't say that everything they do is bad (without looking ridiculous).
    The build quality has gotten outstanding reviews, and that also shows in the price tag. It's far more expensive than the Samsung (costs about twice as much), but you get what you pay for.
    Makes sense that Samsung wouldn't spend money on the VGA connection of a display that is primarily targeted at gamers and doesn't have 16.7M colors to begin with.

    This is getting ridiculous. Do some research...

    Yeah, and you've gotten good answers. All I've done is try to correct you (I'm not saying I got everything right, but in my eyes you have misunderstood quite a bit when it comes to DVI and VGA) and answer questions that came along from others.


    I'm sorry if I come off a bit harsh; I've had a long day and still didn't get nearly as much done as I should have.