My brother is a physicist at MIT, and these are some things he was telling me about luminance, particularly as it applies to monitors:
The luminance of the 1440x900 screen is 250 nits, whereas the luminance of the 1680x1050 is 200 nits.
Now, brightness and luminance are not the same thing. Luminance is the light output per unit area of a surface (what nits measure). Brightness, what we actually perceive, grows roughly logarithmically with luminance, so doubling a screen's luminance produces only a log(2)-sized step in apparent brightness, not a doubling. Depending on which perceptual model you use (there is still no general consensus), the most generous estimate is that the 1440x900 appears only about 11% brighter, and the least generous about 4.2% brighter. The average across these is roughly 8%.
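As an illustration of how a spread like 4.2%-11% can arise, here is a rough sketch using a power-law model of perceived brightness (Stevens' law). The specific exponents, and the choice of model, are my assumptions for illustration, not something stated in the post:

```python
# How much "brighter" might a 250-nit panel look next to a 200-nit one?
# Perceived brightness is often modeled as a power law of luminance
# (B ~ L**n). The exponent n is debated; values around 0.33-0.5 appear
# in the literature, which roughly brackets the 4.2%-11% range quoted.

def perceived_gain(lum_hi, lum_lo, exponent):
    """Fractional perceived-brightness gain under a power-law model."""
    return (lum_hi / lum_lo) ** exponent - 1.0

for n in (0.33, 0.5):
    gain = perceived_gain(250, 200, n)
    print(f"exponent {n}: ~{gain:.1%} perceived gain")
# exponent 0.33 -> ~7.6% gain; exponent 0.5 -> ~11.8% gain
```

Either way, a 25% luminance bump translates into a much smaller perceived difference.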
The difference between these screens would only be noticeable in a direct side-by-side comparison or by extreme lcd connoisseurs.
To me, this makes the roughly 36% upgrade in desktop real estate worth the $50 and the loss of about 8% perceived brightness.
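For what it's worth, a quick check of the real-estate arithmetic, using the pixel counts of the two resolutions quoted above:

```python
# Compare total pixel counts of the two panels.
wsxga_plus = 1680 * 1050  # 1,764,000 pixels
wxga_plus = 1440 * 900    # 1,296,000 pixels

gain = wsxga_plus / wxga_plus - 1
print(f"{gain:.0%} more desktop real estate")  # -> 36%
```

So the real-estate gain is closer to 36% than 33%.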
Just thought I'd share.
IRO
-
yep, agreed.
-
I don't think there's any unresolved physics issue here; it's just that brightness is perceptual, so different people's perceptions can differ, and also it may depend on the lighting conditions in the room to some extent.
I agree though, I don't think a 250 nit screen would appear 25% brighter than a 200 nit one.
Even if the brightness and price were the same, I'd still pick the 1440x900 panel, since for me personally that resolution strikes a nice balance between real estate and readability/text size. But of course your preferences may differ.
-
I don't need a physicist to tell me how bright a screen is; most of the time, as long as it's brighter, that's what people want. I'm sure normal people can't tell whether something is 25% brighter or 50% brighter (how would we even measure that by eye?). I just like knowing that it is indeed brighter, and maybe it buys you one or two more pushes of the brightness-up button.
-
Good to know, thanks for the info.
-
Actually, someone can correct my numbers if I'm wrong, but I'm pretty sure the Lenovo T61 panels are 200 nits, right? And I'm thinking something like a Sony or a Toshiba is around 250-260 nits. I can definitely tell the difference between the two.
-
Interesting statistics. How's the 1280x800 compare?
-
The difference between 250 and 500 is a lot more than 200 and 250. =\
Those sonys/toshibas were probably designed with outdoor use in mind.
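To put numbers on that intuition: under any log-like perceptual model it's the luminance ratio that matters, and 500/250 is a 2x jump while 250/200 is only 1.25x. A minimal sketch (the natural log is an arbitrary choice; any base gives the same ratio of steps):

```python
import math

# Perceptual "step" sizes under a logarithmic brightness model.
step_200_to_250 = math.log(250 / 200)  # 1.25x luminance ratio
step_250_to_500 = math.log(500 / 250)  # 2x luminance ratio

# The 250 -> 500 jump is roughly 3.1x the perceptual step of 200 -> 250.
print(step_250_to_500 / step_200_to_250)
```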
1520's WXGA is NOT 25% brighter than WSXGA
Discussion in 'Dell' started by Iro, Jul 3, 2007.