I don't want to use a third-party application (PowerStrip) to enable a custom resolution. The NVIDIA driver allows custom timings but won't accept them if they fail the test (I have an 8400M GS), and in my setup they always fail. Somehow I did manage to set 1920x1080i once, but got no image. My TV supports 1080i over D-Sub15 with other video cards.
So far I haven't had any problems using 1080i with a component video adapter, but I wanted to save the component video input for another component. I might just get an RCA-to-BNC video cable and connect it to a different input.
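For what it's worth, the CEA-861 timing for 1080i at 60 Hz is widely published; if your driver's custom-timings dialog lets you enter raw numbers (or you're on Linux with an xorg.conf), the usual starting point looks like the modeline below. Treat the exact values as an assumption to verify against your TV's manual:

```
# 1920x1080i @ 60 Hz (CEA-861 timing): 74.25 MHz pixel clock,
# horizontal 1920 2008 2052 2200, vertical 1080 1084 1094 1125
ModeLine "1920x1080i" 74.25 1920 2008 2052 2200 1080 1084 1094 1125 Interlace +HSync +VSync
```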
-
Maybe check the refresh rate? Your TV could be rejecting the refresh rate at that resolution.
-
Umm, since your TV accepts 1080i, that's only 1,280x720... 1080p is the higher resolution.
-
1280x720 refers to the 720i and 720p resolutions... the same interlaced vs. progressive distinction applies there too.
-
i swear you are wrong
-
720p and 1080i will run at a lower resolution than 1080p. I have 3 LCD TVs...
it's 1366x768 (or something similar) vs. 1920x1080.
-
lordofericstan:
Sorry, but ash is correct. 1080 means 1080 horizontal scan lines (lines of vertical resolution), and the i and p mean interlaced vs. progressive. Interlaced is like a flip book: half the lines refresh, then the other half refresh. Progressive is where they all refresh at the same time (like a computer monitor). 1080i takes about the same bandwidth as 720p, because 720p refreshes all 720 of its lines 60 times a second, while 1080i refreshes half of its 1080 lines 60 times a second.
It's really not fair to call 1080i a resolution because of this.
On a side note, TV signals broadcast at 1080p will have about the same quality as 1080i, because 1080p refreshes at 30 frames per second (to conserve bandwidth), whereas the standard for 1080i is half the lines refreshing 60 times per second.
-
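The bandwidth comparison above can be sketched as a back-of-the-envelope calculation (`pixel_rate` is a hypothetical helper; it counts only active pixels and ignores blanking intervals):

```python
def pixel_rate(width, height, fields_per_second, interlaced=False):
    """Active pixels delivered per second (blanking intervals ignored)."""
    lines_per_field = height // 2 if interlaced else height
    return width * lines_per_field * fields_per_second

# 720p60: all 720 lines, 60 times a second
p720 = pixel_rate(1280, 720, 60)           # 55,296,000 pixels/s
# 1080i60: half of the 1080 lines per field, 60 fields a second
i1080 = pixel_rate(1920, 1080, 60, True)   # 62,208,000 pixels/s
# 1080p30: all 1080 lines, 30 times a second
p1080_30 = pixel_rate(1920, 1080, 30)      # 62,208,000 pixels/s
```

The numbers bear out the post: 1080i60 and 1080p30 carry an identical active pixel rate, and 720p60 is in the same ballpark.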
-
It may be that some TVs actually support only 720 lines and downscale a 1080i signal to their native resolution before displaying it... But there are HDTVs that support 1080i and 1080p and actually display all 1080 lines... The wiki link should clarify even more, I guess:
http://en.wikipedia.org/wiki/Image:Common_Video_Resolutions.svg
-
If someone does not agree with a comment, go to Wikipedia or Google it. The information is readily available and ignorance is no excuse.
-
This thread went in the wrong direction.
Just to clarify, my TV supports 1080i natively. It is a Pioneer Elite Pro-730HDi. If there were something better on the market in this size, I would have replaced it yesterday (IMHO). The 1080p plasmas in Pioneer's line cannot touch this CRT rear projection in terms of greyscale and color accuracy. Enough said on this topic; I don't want to turn this thread into another war.
As far as 1080p goes, I can provide multiple links about some Blu-ray/HD-DVD players not outputting a 1080p signal directly, but rather recreating it from 1080i. 24p vs. 60p, those are all different topics.
nizzy1115,
most CRT-based HD displays support 1080i natively. In fact, that is why ATSC includes all the different 1080i broadcast formats. No fixed-pixel display can show an interlaced signal without deinterlacing it first. Please do some reading before making any claims.
Anyhow, I just asked a simple question: did anyone get 1080i working over a D-Sub15 monitor cable with an NVIDIA card?
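The deinterlacing step mentioned above, combining two fields into one progressive frame, can be sketched in its simplest "weave" form (a toy example with made-up names; real deinterlacers also compensate for motion between the two fields):

```python
def weave(top_field, bottom_field):
    """Weave deinterlace: interleave two fields (lists of scan lines)
    into one full frame. The top field supplies even lines (0, 2, ...),
    the bottom field supplies odd lines (1, 3, ...)."""
    frame = []
    for top_line, bottom_line in zip(top_field, bottom_field):
        frame.append(top_line)
        frame.append(bottom_line)
    return frame

# A toy 4-line frame split into two 2-line fields:
top = ["line0", "line2"]
bottom = ["line1", "line3"]
print(weave(top, bottom))  # ['line0', 'line1', 'line2', 'line3']
```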
1080i over monitor cable (D-Sub 15)
Discussion in 'Dell' started by tpaxadpom, Sep 19, 2007.